[ 505.738719] nova-compute[62112]: Invalid -W option ignored: invalid action: '"ignore' [ 507.720438] nova-compute[62112]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' {{(pid=62112) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 507.720879] nova-compute[62112]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' {{(pid=62112) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 507.720961] nova-compute[62112]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' {{(pid=62112) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 507.721240] nova-compute[62112]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs [ 507.846442] nova-compute[62112]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=62112) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:390}} [ 507.854987] nova-compute[62112]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.009s {{(pid=62112) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:428}} [ 507.966671] nova-compute[62112]: INFO nova.virt.driver [None req-820c1302-dd5a-4192-b018-5f9b79830987 None None] Loading compute driver 'vmwareapi.VMwareVCDriver' [ 508.044981] nova-compute[62112]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=62112) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 508.045458] nova-compute[62112]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.002s {{(pid=62112) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 508.045702] nova-compute[62112]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=62112) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}} [ 508.059802] nova-compute[62112]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 508.059802] nova-compute[62112]: warnings.warn( [ 508.064814] nova-compute[62112]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 508.064814] nova-compute[62112]: warnings.warn( [ 509.029406] nova-compute[62208]: Invalid -W option ignored: invalid action: '"ignore' [ 511.071241] nova-compute[62208]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' {{(pid=62208) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 511.071644] nova-compute[62208]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' {{(pid=62208) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 511.071710] nova-compute[62208]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' {{(pid=62208) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}} [ 511.072028] nova-compute[62208]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs [ 511.122977] nova-compute[62208]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=62208) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:390}} [ 511.131718] nova-compute[62208]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.009s {{(pid=62208) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:428}} [ 511.205684] nova-compute[62208]: INFO nova.virt.driver [None req-a42142dd-4bfd-4ed9-8ca0-147117b9d495 None None] Loading compute driver 'vmwareapi.VMwareVCDriver' [ 511.243333] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 511.243533] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 511.243593] nova-compute[62208]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=62208) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}} [ 511.257812] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 511.257812] nova-compute[62208]: warnings.warn( [ 511.261813] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 511.261813] nova-compute[62208]: warnings.warn( [ 512.365173] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 512.365173] nova-compute[62208]: warnings.warn( [ 512.374784] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 512.374784] nova-compute[62208]: warnings.warn( [ 512.378248] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 512.378248] nova-compute[62208]: warnings.warn( [ 512.381240] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 512.381240] nova-compute[62208]: warnings.warn( [ 512.385392] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 512.385392] nova-compute[62208]: warnings.warn( [ 512.392771] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 512.392771] nova-compute[62208]: warnings.warn( [ 512.671320] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 512.671320] nova-compute[62208]: warnings.warn( [ 514.413462] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-34eb805b-0a50-4c0a-9259-5a89b3f5cbd1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.415596] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.415596] nova-compute[62208]: warnings.warn( [ 514.430169] nova-compute[62208]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=62208) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}} [ 514.430299] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-41608298-dd05-466a-b642-aeafa601cf1a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.431887] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.431887] nova-compute[62208]: warnings.warn( [ 514.464078] nova-compute[62208]: INFO oslo_vmware.api [-] Successfully established new session; session ID is fcca9. [ 514.464318] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.221s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 514.464940] nova-compute[62208]: INFO nova.virt.vmwareapi.driver [None req-a42142dd-4bfd-4ed9-8ca0-147117b9d495 None None] VMware vCenter version: 7.0.3 [ 514.468394] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c9a3edd-d061-4512-9230-fd3dad100d03 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.478327] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.478327] nova-compute[62208]: warnings.warn( [ 514.486584] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bb394df-23ad-4c66-962a-74da7af5b571 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.488987] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.488987] nova-compute[62208]: warnings.warn( [ 514.493063] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0ee39e8-1497-4dca-a6d2-22a975355e67 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.495339] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 514.495339] nova-compute[62208]: warnings.warn(
[ 514.499961] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36909ff8-bb28-4f24-9735-ff83c74c2266 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.503404] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 514.503404] nova-compute[62208]: warnings.warn(
[ 514.513424] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af15005c-551c-428d-8146-ac758290f012 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.515608] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 514.515608] nova-compute[62208]: warnings.warn(
[ 514.519828] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8701ef95-2ec4-4bcc-8e9f-f751dee38d37 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.522704] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 514.522704] nova-compute[62208]: warnings.warn(
[ 514.550383] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-6ef9be65-1393-4cc3-b756-fd64332ab448 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.551935] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 514.551935] nova-compute[62208]: warnings.warn(
[ 514.556300] nova-compute[62208]: DEBUG nova.virt.vmwareapi.driver [None req-a42142dd-4bfd-4ed9-8ca0-147117b9d495 None None] Extension org.openstack.compute already exists. {{(pid=62208) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 514.559042] nova-compute[62208]: INFO nova.compute.provider_config [None req-a42142dd-4bfd-4ed9-8ca0-147117b9d495 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
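The sequence above is the vmwareapi driver bringing up its vCenter connection through oslo.vmware: the oslo_vmware_api_lock is taken, a suds client is built against the /sdk endpoint, SessionManager.Login reports "Successfully established new session", and each HTTPS call trips urllib3's InsecureRequestWarning because the session is created without certificate verification (in nova that behaviour is governed by the [vmware] insecure and ca_file options). Below is a minimal sketch of the same handshake done directly with oslo.vmware; the credentials, CA path and keyword names are illustrative assumptions, not values taken from this deployment.

```python
# Hedged sketch: open a vCenter session the way the log above shows, but with
# certificate verification enabled so urllib3 stops warning. Credentials, the
# CA bundle path and the exact keyword names are assumptions.
from oslo_vmware import api

session = api.VMwareAPISession(
    'vc1.osci.c.eu-de-1.cloud.sap',    # vCenter host seen in the log
    'administrator@vsphere.local',     # placeholder username
    'not-the-real-password',           # placeholder password
    10,                                # api_retry_count
    0.5,                               # task_poll_interval
    port=443,
    cacert='/etc/nova/vcenter-ca.pem', # presumed CA bundle; with this set and
    insecure=False,                    # insecure=False the warning goes away
)
# RetrieveServiceContent is what the log invokes right after login.
print('vCenter version:', session.vim.service_content.about.version)
session.logout()
```

In nova.conf the equivalent is setting ca_file (and leaving insecure = false) in the [vmware] section; the driver passes those values through to the session it creates, which is why this capture, run without a CA file, logs the warning on every request.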
[ 514.570937] nova-compute[62208]: DEBUG nova.context [None req-a42142dd-4bfd-4ed9-8ca0-147117b9d495 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),da7c2cdb-d090-4c8e-97a8-c335ae01cae8(cell1) {{(pid=62208) load_cells /opt/stack/nova/nova/context.py:464}}
[ 514.572814] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 514.573347] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 514.573690] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 514.574110] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Acquiring lock "da7c2cdb-d090-4c8e-97a8-c335ae01cae8" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 514.574319] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Lock "da7c2cdb-d090-4c8e-97a8-c335ae01cae8" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 514.575258] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Lock "da7c2cdb-d090-4c8e-97a8-c335ae01cae8" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 514.606775] nova-compute[62208]: INFO dbcounter [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Registered counter for database nova_cell0
[ 514.615126] nova-compute[62208]: INFO dbcounter [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Registered counter for database nova_cell1
[ 514.621567] nova-compute[62208]: DEBUG oslo_db.sqlalchemy.engines [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION {{(pid=62208) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 514.623941] nova-compute[62208]: DEBUG oslo_db.sqlalchemy.engines [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION {{(pid=62208) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 514.625794] nova-compute[62208]: DEBUG dbcounter [-] [62208] Writer thread running {{(pid=62208) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 514.628679] nova-compute[62208]: ERROR nova.db.main.api [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 264, in main
[ 514.628679] nova-compute[62208]: result = function(*args, **kwargs)
[ 514.628679] nova-compute[62208]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 514.628679] nova-compute[62208]: return func(*args, **kwargs)
[ 514.628679] nova-compute[62208]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 514.628679] nova-compute[62208]: result = fn(*args, **kwargs)
[ 514.628679] nova-compute[62208]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 514.628679] nova-compute[62208]: return f(*args, **kwargs)
[ 514.628679] nova-compute[62208]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 514.628679] nova-compute[62208]: return db.service_get_minimum_version(context, binaries)
[ 514.628679] nova-compute[62208]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 514.628679] nova-compute[62208]: _check_db_access()
[ 514.628679] nova-compute[62208]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 514.628679] nova-compute[62208]: stacktrace = ''.join(traceback.format_stack())
[ 514.628679] nova-compute[62208]:
[ 514.629956] nova-compute[62208]: DEBUG dbcounter [-] [62208] Writer thread running {{(pid=62208) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 514.631219] nova-compute[62208]: ERROR nova.db.main.api [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 264, in main
[ 514.631219] nova-compute[62208]: result = function(*args, **kwargs)
[ 514.631219] nova-compute[62208]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 514.631219] nova-compute[62208]: return func(*args, **kwargs)
[ 514.631219] nova-compute[62208]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 514.631219] nova-compute[62208]: result = fn(*args, **kwargs)
[ 514.631219] nova-compute[62208]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 514.631219] nova-compute[62208]: return f(*args, **kwargs)
[ 514.631219] nova-compute[62208]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 514.631219] nova-compute[62208]: return db.service_get_minimum_version(context, binaries)
[ 514.631219] nova-compute[62208]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 514.631219] nova-compute[62208]: _check_db_access()
[ 514.631219] nova-compute[62208]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 514.631219] nova-compute[62208]: stacktrace = ''.join(traceback.format_stack())
[ 514.631219] nova-compute[62208]:
[ 514.631940] nova-compute[62208]: WARNING nova.objects.service [None
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Failed to get minimum service version for cell da7c2cdb-d090-4c8e-97a8-c335ae01cae8 [ 514.631940] nova-compute[62208]: WARNING nova.objects.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000 [ 514.632205] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Acquiring lock "singleton_lock" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 514.632370] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Acquired lock "singleton_lock" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 514.632618] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Releasing lock "singleton_lock" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 514.632982] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Full set of CONF: {{(pid=62208) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:363}} [ 514.633133] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ******************************************************************************** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}} [ 514.633346] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] Configuration options gathered from: {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}} [ 514.633434] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}} [ 514.633581] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}} [ 514.633713] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ================================================================================ {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}} [ 514.633921] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] allow_resize_to_same_host = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.634091] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] arq_binding_timeout = 300 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.634222] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] backdoor_port = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.634348] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] backdoor_socket = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.634515] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] block_device_allocate_retries = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.634680] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] block_device_allocate_retries_interval = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.634849] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cert = self.pem {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.635013] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.635184] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute_monitors = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.635366] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] config_dir = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.635554] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] config_drive_format = iso9660 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.635693] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.635858] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] config_source = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.636039] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] console_host = devstack {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.636213] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] control_exchange = nova {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.636373] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cpu_allocation_ratio = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.636535] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] daemon = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.636703] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] debug = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.636861] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] default_access_ip_network_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.637026] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] default_availability_zone = nova {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.637185] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] default_ephemeral_format = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.637346] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] default_green_pool_size = 1000 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.637623] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.637792] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] default_schedule_zone = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.637954] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] disk_allocation_ratio = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.638118] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] enable_new_services = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.638295] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] enabled_apis = ['osapi_compute'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.638464] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] enabled_ssl_apis = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} 
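The run of "option = value" lines above and below is oslo.config's standard start-up dump: once the three --config-file arguments listed earlier are parsed, oslo.service calls log_opt_values() on the resulting configuration and logs one line per registered option, masking secret options such as transport_url as ****. A small standalone sketch of that mechanism is below; the registered options are illustrative and only a tiny subset of what nova actually defines.

```python
# Hedged sketch of the oslo.config dump seen in this log: parse a config,
# then emit every registered option at DEBUG level via log_opt_values().
import logging
from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.CONF
CONF.register_opts([
    # Illustrative options only; nova registers its own, much larger set.
    cfg.StrOpt('compute_driver', default='vmwareapi.VMwareVCDriver'),
    cfg.BoolOpt('allow_resize_to_same_host', default=True),
])
# nova-compute is started with --config-file /etc/nova/nova.conf (plus two
# more files); here an empty command line keeps the sketch runnable anywhere.
CONF([], project='nova')
CONF.log_opt_values(LOG, logging.DEBUG)  # prints one "option = value" per opt
```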
[ 514.638627] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] flat_injected = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.638785] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] force_config_drive = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.638941] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] force_raw_images = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.639109] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] graceful_shutdown_timeout = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.639270] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] heal_instance_info_cache_interval = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.639514] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] host = cpu-1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.639717] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] initial_cpu_allocation_ratio = 4.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.639911] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] initial_disk_allocation_ratio = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.640103] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] initial_ram_allocation_ratio = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.640327] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.640494] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] instance_build_timeout = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.640658] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] instance_delete_interval = 300 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.640825] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] instance_format = [instance: %(uuid)s] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.640995] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] instance_name_template = instance-%08x {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.641156] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] instance_usage_audit = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.641323] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] instance_usage_audit_period = month {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.641488] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.641655] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] instances_path = /opt/stack/data/nova/instances {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.641824] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] internal_service_availability_zone = internal {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.641980] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] key = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.642140] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] live_migration_retry_count = 30 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.642303] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] log_config_append = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.642469] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.642629] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] log_dir = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.642785] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] log_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.642912] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] log_options = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.643072] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] log_rotate_interval = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.643236] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] log_rotate_interval_type = days {{(pid=62208) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.643481] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] log_rotation_type = none {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.643563] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.643654] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.643826] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.643990] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.644131] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.644299] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] long_rpc_timeout = 1800 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.644463] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] max_concurrent_builds = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.644625] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] max_concurrent_live_migrations = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.644784] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] max_concurrent_snapshots = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.644943] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] max_local_block_devices = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.645105] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] max_logfile_count = 30 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.645267] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] max_logfile_size_mb = 200 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.645473] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] maximum_instance_delete_attempts = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.645669] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] metadata_listen = 0.0.0.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.645847] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] metadata_listen_port = 8775 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.646022] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] metadata_workers = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.646188] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] migrate_max_retries = -1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.646359] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] mkisofs_cmd = genisoimage {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.646568] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] my_block_storage_ip = 10.180.1.21 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.646704] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] my_ip = 10.180.1.21 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.646869] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] network_allocate_retries = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.647050] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.647220] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] osapi_compute_listen = 0.0.0.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.647385] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] osapi_compute_listen_port = 8774 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.647594] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] osapi_compute_unique_server_name_scope = {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.647769] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] osapi_compute_workers = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.647934] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] password_length = 12 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.648130] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] periodic_enable = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.648297] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] periodic_fuzzy_delay = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.648471] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] pointer_model = usbtablet {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.648643] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] preallocate_images = none {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.648806] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] publish_errors = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.648939] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] pybasedir = /opt/stack/nova {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.649096] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ram_allocation_ratio = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.649262] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] rate_limit_burst = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.649429] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] rate_limit_except_level = CRITICAL {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.649591] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] rate_limit_interval = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.649757] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] reboot_timeout = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.649921] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] reclaim_instance_interval = 0 {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.650079] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] record = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.650248] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] reimage_timeout_per_gb = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.650414] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] report_interval = 120 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.650576] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] rescue_timeout = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.650738] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] reserved_host_cpus = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.650899] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] reserved_host_disk_mb = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.651057] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] reserved_host_memory_mb = 512 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.651216] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] reserved_huge_pages = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.651375] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] resize_confirm_window = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.651544] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] resize_fs_using_block_device = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.651739] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] resume_guests_state_on_host_boot = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.651916] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.652094] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] rpc_response_timeout = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.652262] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] run_external_periodic_tasks = True {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.652433] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] running_deleted_instance_action = reap {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.652594] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] running_deleted_instance_poll_interval = 1800 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.652757] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] running_deleted_instance_timeout = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.652918] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler_instance_sync_interval = 120 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.653086] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_down_time = 720 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.653256] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] servicegroup_driver = db {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.653417] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] shelved_offload_time = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.653621] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] shelved_poll_interval = 3600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.653738] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] shutdown_timeout = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.653904] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] source_is_ipv6 = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.654061] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ssl_only = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.654320] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.654488] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] sync_power_state_interval = 600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.654651] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] sync_power_state_pool_size = 1000 
{{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.654822] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] syslog_log_facility = LOG_USER {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.654980] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] tempdir = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.655141] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] timeout_nbd = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.655320] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] transport_url = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.655515] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] update_resources_interval = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.655695] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] use_cow_images = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.655861] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] use_eventlog = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.656039] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] use_journal = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.656196] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] use_json = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.656356] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] use_rootwrap_daemon = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.656517] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] use_stderr = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.656677] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] use_syslog = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.656834] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vcpu_pin_set = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.657002] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plugging_is_fatal = True {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.657168] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plugging_timeout = 300 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.657334] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] virt_mkfs = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.657531] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] volume_usage_poll_interval = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.657701] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] watch_log_file = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.657872] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] web = /usr/share/spice-html5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.658064] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_concurrency.disable_process_locking = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.658378] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.658563] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.658735] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.658910] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_metrics.metrics_process_name = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.659119] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.659291] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.659478] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.auth_strategy = keystone {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.659657] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.compute_link_prefix = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.659857] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.660055] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.dhcp_domain = novalocal {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.660229] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.enable_instance_password = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.660395] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.glance_link_prefix = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.660562] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.660736] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.instance_list_cells_batch_strategy = distributed {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.660899] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.instance_list_per_project_cells = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.661061] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.list_records_by_skipping_down_cells = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.661223] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.local_metadata_per_cell = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.661392] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.max_limit = 1000 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.661562] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.metadata_cache_expiration = 15 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.661741] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.neutron_default_tenant_id = default {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.661908] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.use_neutron_default_nets = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.662080] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.662244] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.vendordata_dynamic_failure_fatal = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.662411] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.662584] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.vendordata_dynamic_ssl_certfile = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.662759] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.vendordata_dynamic_targets = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.662933] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.vendordata_jsonfile_path = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.663114] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api.vendordata_providers = ['StaticJSON'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.663305] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.backend = dogpile.cache.memcached {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.663502] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.backend_argument = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.663644] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.config_prefix = cache.oslo {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.663816] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.dead_timeout = 60.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.663979] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.debug_cache_backend = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.664155] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.enable_retry_client = False {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.664317] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.enable_socket_keepalive = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.664487] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.enabled = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.664654] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.expiration_time = 600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.664815] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.hashclient_retry_attempts = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.664981] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.hashclient_retry_delay = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.665147] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_dead_retry = 300 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.665319] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_password = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.665507] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.665688] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.665857] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_pool_maxsize = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.666025] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_pool_unused_timeout = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.666189] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_sasl_enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.666370] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_servers = ['localhost:11211'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.666538] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_socket_timeout = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.666713] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.memcache_username = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.666880] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.proxies = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.667048] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.retry_attempts = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.667215] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.retry_delay = 0.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.667380] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.socket_keepalive_count = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.667579] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.socket_keepalive_idle = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.667743] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.socket_keepalive_interval = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.667902] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.tls_allowed_ciphers = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.668072] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.tls_cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.668234] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.tls_certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.668398] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.tls_enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.668558] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cache.tls_keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.668731] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.668905] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.auth_type = password {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.669068] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.669246] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.catalog_info = volumev3::publicURL {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.669408] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.669576] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.669747] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.cross_az_attach = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.669937] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.debug = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.670105] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.endpoint_template = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.670338] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.http_retries = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.670437] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.670599] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.670770] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.os_region_name = RegionOne {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.670933] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.671095] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cinder.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.671269] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.671430] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.cpu_dedicated_set = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.671590] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.cpu_shared_set = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.671757] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.image_type_exclude_list = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.671920] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.live_migration_wait_for_vif_plug = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.672095] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.max_concurrent_disk_ops = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.672261] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.max_disk_devices_to_attach = -1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.672425] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.672594] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.672761] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.resource_provider_association_refresh = 300 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.672923] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.shutdown_retry_interval = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.673103] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.673283] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] conductor.workers = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.673459] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None 
None] console.allowed_origins = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.673623] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] console.ssl_ciphers = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.673860] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] console.ssl_minimum_version = default {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.673969] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] consoleauth.token_ttl = 600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.674150] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.674310] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.674478] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.674641] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.connect_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.674801] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.connect_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.674960] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.endpoint_override = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.675122] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.675281] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.675449] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.max_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.675634] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.min_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.675797] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] 
cyborg.region_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.675958] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.retriable_status_codes = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.676131] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.service_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.676302] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.service_type = accelerator {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.676463] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.676622] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.status_code_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.676784] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.status_code_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.676943] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.677124] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.677285] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] cyborg.version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.677669] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.backend = sqlalchemy {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.677669] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.connection = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.677848] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.connection_debug = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.678008] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.connection_parameters = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.678177] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.connection_recycle_time = 3600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.678351] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.connection_trace = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.678523] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.db_inc_retry_interval = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.678671] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.db_max_retries = 20 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.678836] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.db_max_retry_interval = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.678999] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.db_retry_interval = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.679169] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.max_overflow = 50 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.679335] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.max_pool_size = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.679506] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.max_retries = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.679678] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.mysql_sql_mode = TRADITIONAL {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.679863] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.mysql_wsrep_sync_wait = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.680049] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.pool_timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.680230] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.retry_interval = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.680413] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.slave_connection = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.680561] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.sqlite_synchronous = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.680728] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] database.use_db_reconnect = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.680918] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.backend = sqlalchemy {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.681096] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.connection = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.681269] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.connection_debug = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.681440] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.connection_parameters = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.681753] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.connection_recycle_time = 3600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.681812] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.connection_trace = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.681932] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.db_inc_retry_interval = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.682085] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.db_max_retries = 20 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.682251] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.db_max_retry_interval = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.682413] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.db_retry_interval = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.682586] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.max_overflow = 50 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.682755] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.max_pool_size = 5 {{(pid=62208) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.682925] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.max_retries = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.683098] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.683263] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.mysql_wsrep_sync_wait = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.683427] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.pool_timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.683602] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.retry_interval = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.683767] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.slave_connection = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.683989] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] api_database.sqlite_synchronous = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.684119] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] devices.enabled_mdev_types = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.684298] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.684462] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ephemeral_storage_encryption.enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.684628] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ephemeral_storage_encryption.key_size = 512 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.685092] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.api_servers = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.685092] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.685217] nova-compute[62208]: DEBUG 
oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.685276] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.685417] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.connect_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.685608] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.connect_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.685781] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.debug = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.685950] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.default_trusted_certificate_ids = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.686112] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.enable_certificate_validation = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.686274] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.enable_rbd_download = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.686434] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.endpoint_override = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.686649] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.686766] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.686927] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.max_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.687088] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.min_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.687258] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.num_retries = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.687430] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.rbd_ceph_conf = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.687624] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.rbd_connect_timeout = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.687825] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.rbd_pool = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.688056] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.rbd_user = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.688309] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.region_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.688495] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.retriable_status_codes = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.688664] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.service_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.688841] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.service_type = image {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.689009] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.689171] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.status_code_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.689334] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.status_code_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.689496] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.689685] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.689855] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.verify_glance_signatures = False {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.690018] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] glance.version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.690229] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] guestfs.debug = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.690420] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.config_drive_cdrom = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.690618] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.config_drive_inject_password = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.690797] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.690965] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.enable_instance_metrics_collection = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.691130] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.enable_remotefx = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.691306] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.instances_path_share = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.691477] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.iscsi_initiator_list = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.691643] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.limit_cpu_features = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.691808] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.691971] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.692172] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.power_state_check_timeframe = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.692338] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.power_state_event_polling_interval = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.692511] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.692678] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.use_multipath_io = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.692842] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.volume_attach_retry_count = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.693003] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.volume_attach_retry_interval = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.693164] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.vswitch_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.693328] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.693499] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] mks.enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.693873] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.694115] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] image_cache.manager_interval = 2400 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.694235] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] image_cache.precache_concurrency = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.694407] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] image_cache.remove_unused_base_images = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.694581] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.694752] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=62208) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.694932] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] image_cache.subdirectory_name = _base {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.695109] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.api_max_retries = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.695276] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.api_retry_interval = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.695459] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.695634] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.auth_type = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.695802] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.695965] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.696143] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.696311] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.conductor_group = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.696475] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.connect_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.696639] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.connect_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.696801] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.endpoint_override = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.696965] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.697126] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.keyfile = None {{(pid=62208) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.697289] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.max_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.697485] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.min_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.697648] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.peer_list = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.697813] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.region_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.697975] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.retriable_status_codes = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.698141] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.serial_console_state_timeout = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.698302] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.service_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.698476] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.service_type = baremetal {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.698642] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.698803] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.status_code_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.698962] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.status_code_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.699120] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.699300] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.699463] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] 
ironic.version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.699648] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.699839] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] key_manager.fixed_key = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.700041] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.700212] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.barbican_api_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.700374] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.barbican_endpoint = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.700571] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.barbican_endpoint_type = public {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.700706] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.barbican_region_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.700863] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.701024] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.701211] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.701353] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.701514] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.701682] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.number_of_retries = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.701843] nova-compute[62208]: DEBUG 
oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.retry_delay = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.702009] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.send_service_user_token = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.702172] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.702332] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.702494] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.verify_ssl = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.702654] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican.verify_ssl_path = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.702820] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.702982] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.auth_type = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.703141] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.703301] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.703465] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.703630] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.703790] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.703955] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.split_loggers = False {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.704127] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] barbican_service_user.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.704297] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.approle_role_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.704460] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.approle_secret_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.704622] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.704781] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.704947] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.705111] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.705270] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.705448] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.kv_mountpoint = secret {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.705635] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.kv_path = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.705804] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.kv_version = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.705965] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.namespace = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.706125] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.root_token_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.706290] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.split_loggers = False {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.706451] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.ssl_ca_crt_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.706613] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.706776] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.use_ssl = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.706947] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.707117] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.707283] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.auth_type = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.707448] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.707629] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.707799] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.707962] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.connect_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.708136] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.connect_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.708290] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.endpoint_override = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.708456] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.708615] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.keyfile = None {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.708775] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.max_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.708932] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.min_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.709090] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.region_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.709248] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.retriable_status_codes = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.709404] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.service_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.709576] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.service_type = identity {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.709742] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.709926] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.status_code_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.710407] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.status_code_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.710486] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.710666] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.710837] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] keystone.version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.711042] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.connection_uri = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.711209] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.cpu_mode 
= None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.711382] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.cpu_model_extra_flags = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.711554] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.cpu_models = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.711732] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.cpu_power_governor_high = performance {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.711902] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.cpu_power_governor_low = powersave {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.712083] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.cpu_power_management = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.712263] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.712432] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.device_detach_attempts = 8 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.712901] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.device_detach_timeout = 20 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.712901] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.disk_cachemodes = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.712901] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.disk_prefix = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.713080] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.enabled_perf_events = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.713226] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.file_backed_memory = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.713395] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.gid_maps = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.713556] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.hw_disk_discard = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.713711] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.hw_machine_type = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.713885] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.images_rbd_ceph_conf = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.714046] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.714253] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.714540] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.images_rbd_glance_store_name = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.714600] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.images_rbd_pool = rbd {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.714766] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.images_type = default {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.714929] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.images_volume_group = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.715092] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.inject_key = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.715256] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.inject_partition = -2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.715416] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.inject_password = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.715609] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.iscsi_iface = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.715778] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.iser_use_multipath = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} 
[ 514.715942] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_bandwidth = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.716188] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_completion_timeout = 800 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.716425] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_downtime = 500 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.716535] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_downtime_delay = 75 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.716651] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_downtime_steps = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.716766] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_inbound_addr = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.716994] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_permit_auto_converge = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.717089] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_permit_post_copy = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.717249] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_scheme = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.717421] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_timeout_action = abort {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.717612] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_tunnelled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.717779] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_uri = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.717947] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.live_migration_with_native_tls = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.718107] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.max_queues = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.718271] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.mem_stats_period_seconds = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.718533] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.migration_inbound_addr = 10.180.1.21 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.718709] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.nfs_mount_options = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.719014] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.719190] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.num_aoe_discover_tries = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.719358] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.num_iser_scan_tries = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.719520] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.num_memory_encrypted_guests = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.719688] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.num_nvme_discover_tries = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.719875] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.num_pcie_ports = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.720062] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.num_volume_scan_tries = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.720233] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.pmem_namespaces = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.720394] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.quobyte_client_cfg = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.720687] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.720863] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rbd_connect_timeout = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.721029] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.721194] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.721359] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rbd_secret_uuid = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.721521] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rbd_user = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.721690] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.realtime_scheduler_priority = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.721862] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.remote_filesystem_transport = ssh {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.722026] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rescue_image_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.722186] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rescue_kernel_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.722346] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rescue_ramdisk_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.722519] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rng_dev_path = /dev/urandom {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.722926] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.rx_queue_size = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.722926] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.smbfs_mount_options = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.723389] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.723389] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.snapshot_compression = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.723501] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.snapshot_image_format = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.723627] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.723797] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.sparse_logical_volumes = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.723962] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.swtpm_enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.724146] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.swtpm_group = tss {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.724315] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.swtpm_user = tss {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.724487] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.sysinfo_serial = unique {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.724648] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.tb_cache_size = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.724807] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.tx_queue_size = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.724970] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.uid_maps = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.725136] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.use_virtio_for_bridges = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.725308] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.virt_type = kvm {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.725502] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.volume_clear = zero {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.725684] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.volume_clear_size = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.725853] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.volume_use_multipath = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.726015] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.vzstorage_cache_path = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.726188] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.726358] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.vzstorage_mount_group = qemu {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.726526] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.vzstorage_mount_opts = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.726700] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.726974] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.727151] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.vzstorage_mount_user = stack {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.727320] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.727518] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.727705] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.auth_type = password {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.727868] nova-compute[62208]: DEBUG 
oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.728040] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.728210] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.728370] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.connect_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.728559] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.connect_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.728740] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.default_floating_pool = public {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.728900] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.endpoint_override = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.729063] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.extension_sync_interval = 600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.729226] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.http_retries = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.729387] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.729549] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.729712] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.max_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.729883] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.metadata_proxy_shared_secret = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.730044] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.min_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.730216] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.ovs_bridge = br-int {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.730382] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.physnets = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.730553] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.region_name = RegionOne {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.730720] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.retriable_status_codes = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.730887] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.service_metadata_proxy = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.731043] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.service_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.731212] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.service_type = network {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.731372] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.731534] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.status_code_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.731711] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.status_code_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.731889] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.732097] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.732275] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] neutron.version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.732452] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] notifications.bdms_in_notifications = False {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.732637] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] notifications.default_level = INFO {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.732822] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] notifications.notification_format = unversioned {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.733001] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] notifications.notify_on_state_change = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.733180] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.733360] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] pci.alias = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.733531] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] pci.device_spec = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.733699] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] pci.report_in_placement = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.733876] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.734051] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.auth_type = password {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.734221] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.auth_url = http://10.180.1.21/identity {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.734383] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.734601] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.734706] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.734869] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.connect_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.735033] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.connect_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.735196] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.default_domain_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.735356] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.default_domain_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.735541] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.domain_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.735714] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.domain_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.735875] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.endpoint_override = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.736052] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.736217] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.736377] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.max_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.736537] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.min_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.736709] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.password = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.736869] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.project_domain_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.737039] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.project_domain_name = Default {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.737207] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.project_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.737380] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.project_name = service {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.737581] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.region_name = RegionOne {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.737755] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.retriable_status_codes = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.737920] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.service_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.738095] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.service_type = placement {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.738262] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.738428] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.status_code_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.738609] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.status_code_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.738778] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.system_scope = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.738940] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.739102] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.trust_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.739262] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.user_domain_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.739431] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.user_domain_name = Default {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.739593] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.user_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.739770] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.username = placement {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.739953] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.740130] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] placement.version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.740312] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.cores = 20 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.740480] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.count_usage_from_placement = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.740656] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.740854] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.injected_file_content_bytes = 10240 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.741007] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.injected_file_path_length = 255 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.741177] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.injected_files = 5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.741347] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.instances = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.741516] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.key_pairs = 100 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.741684] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.metadata_items = 128 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.741851] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.ram = 51200 
{{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.742014] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.recheck_quota = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.742183] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.server_group_members = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.742351] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] quota.server_groups = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.742522] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] rdp.enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.742844] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.743028] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.743198] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.743363] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.image_metadata_prefilter = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.743527] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.743696] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.max_attempts = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.743866] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.max_placement_results = 1000 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.744139] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.744218] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.query_placement_for_image_type_support = False {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.744367] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.744542] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] scheduler.workers = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.744725] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.744895] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.745078] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.745252] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.745421] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.745644] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.745848] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.746043] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.746215] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.host_subset_size = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.746387] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.746551] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.image_properties_default_architecture = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.746723] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.746898] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.isolated_hosts = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.747066] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.isolated_images = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.747233] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.max_instances_per_host = 50 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.747398] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.747597] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.747768] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.pci_in_placement = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.747937] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.748116] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.748283] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.748457] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.748652] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=62208) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.748823] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.748988] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.track_instance_changes = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.749165] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.749337] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] metrics.required = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.749512] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] metrics.weight_multiplier = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.749683] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] metrics.weight_of_unavailable = -10000.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.749852] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] metrics.weight_setting = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.750157] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.750333] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] serial_console.enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.750513] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] serial_console.port_range = 10000:20000 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.750694] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.750867] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.751036] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] serial_console.serialproxy_port = 6083 {{(pid=62208) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.751204] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.751414] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.auth_type = password {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.751579] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.751742] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.751905] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.752077] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.752240] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.752408] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.send_service_user_token = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.752574] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.752735] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] service_user.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.752905] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.agent_enabled = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.753068] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.753362] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.753563] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None 
None] spice.html5proxy_host = 0.0.0.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.753741] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.html5proxy_port = 6082 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.753909] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.image_compression = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.754069] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.jpeg_compression = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.754228] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.playback_compression = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.754398] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.server_listen = 127.0.0.1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.754568] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.754731] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.streaming_mode = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.754889] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] spice.zlib_compression = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.755055] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] upgrade_levels.baseapi = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.755223] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] upgrade_levels.compute = auto {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.755386] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] upgrade_levels.conductor = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.755572] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] upgrade_levels.scheduler = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.755749] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.755940] nova-compute[62208]: DEBUG 
oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.auth_type = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.756134] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.756299] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.756463] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.756648] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.756832] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.756998] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.757158] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vendordata_dynamic_auth.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.757335] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.api_retry_count = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.757539] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.ca_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.757723] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.cache_prefix = devstack-image-cache {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.757895] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.cluster_name = testcl1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.758061] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.connection_pool_size = 10 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.758223] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.console_delay_seconds = None {{(pid=62208) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.758390] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.datastore_regex = ^datastore.* {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.758601] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.758774] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.host_password = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.758943] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.host_port = 443 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.759146] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.host_username = administrator@vsphere.local {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.759322] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.insecure = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.759486] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.integration_bridge = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.759653] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.maximum_objects = 100 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.759816] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.pbm_default_policy = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.759978] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.pbm_enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.760152] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.pbm_wsdl_location = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.760323] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.760485] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.serial_port_proxy_uri = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.760646] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.serial_port_service_uri = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.760815] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.task_poll_interval = 0.5 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.760994] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.use_linked_clone = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.761155] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.vnc_keymap = en-us {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.761402] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.vnc_port = 5900 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.761595] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vmware.vnc_port_total = 10000 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.761789] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.auth_schemes = ['none'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.761960] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.enabled = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.762253] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.762440] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.762618] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.novncproxy_port = 6080 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.762797] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.server_listen = 127.0.0.1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.763058] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.763236] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.vencrypt_ca_certs = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.763400] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.vencrypt_client_cert = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.763634] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vnc.vencrypt_client_key = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.763823] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.763990] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.disable_fallback_pcpu_query = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.764169] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.disable_group_policy_check_upcall = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.764335] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.764502] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.disable_rootwrap = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.764669] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.enable_numa_live_migration = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.764832] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.764997] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.765160] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.handle_virt_lifecycle_events = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.765326] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.libvirt_disable_apic = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.765500] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.never_download_image_if_on_rbd = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.765669] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.765834] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.765997] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.766161] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.766322] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.766484] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.766647] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.766809] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.766974] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.767156] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.767327] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.client_socket_timeout = 900 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.767503] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.default_pool_size = 1000 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.767675] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.keep_alive = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.767844] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.max_header_line = 16384 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.768057] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.secure_proxy_ssl_header = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.768190] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.ssl_ca_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.768354] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.ssl_cert_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.768528] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.ssl_key_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.768706] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.tcp_keepidle = 600 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.768886] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.769056] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] zvm.ca_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.769219] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] zvm.cloud_connector_url = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.769510] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.769686] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] zvm.reachable_timeout = 300 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.769869] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.enforce_new_defaults = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.770040] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.enforce_scope = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.770217] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] 
oslo_policy.policy_default_rule = default {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.770400] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.770576] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.policy_file = policy.yaml {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.770750] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.770915] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.771074] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.771228] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.771388] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.771553] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.771732] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.771909] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.connection_string = messaging:// {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.772088] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.enabled = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.772263] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.es_doc_type = notification {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.772426] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.es_scroll_size = 10000 
{{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.772594] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.es_scroll_time = 2m {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.772759] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.filter_error_trace = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.772926] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.hmac_keys = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.773092] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.sentinel_service_name = mymaster {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.773256] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.socket_timeout = 0.1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.773418] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.trace_requests = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.773636] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler.trace_sqlalchemy = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.773831] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler_jaeger.process_tags = {} {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.774002] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler_jaeger.service_name_prefix = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.774175] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] profiler_otlp.service_name_prefix = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.774343] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] remote_debug.host = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.774506] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] remote_debug.port = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.774745] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.774918] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.775164] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.775336] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.775499] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.775662] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.775826] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.775990] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.heartbeat_rate = 3 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.776171] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.776333] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.776504] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.776710] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.776883] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.777049] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.777209] nova-compute[62208]: 
DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.777381] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.777551] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.777716] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.777879] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.778044] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.778203] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.778367] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.778527] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.778690] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.778853] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.779016] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.ssl = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.779187] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.779353] 
nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.779513] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.779682] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.779850] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_rabbit.ssl_version = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.780043] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.780215] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_notifications.retry = -1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.780397] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.780570] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_messaging_notifications.transport_url = **** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.780745] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.auth_section = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.780907] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.auth_type = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.781064] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.cafile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.781220] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.certfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.781379] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.collect_timing = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.781538] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] 
oslo_limit.connect_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.781701] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.connect_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.781860] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.endpoint_id = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.782017] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.endpoint_override = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.782177] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.insecure = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.782336] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.keyfile = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.782493] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.max_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.782650] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.min_version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.782809] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.region_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.782966] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.retriable_status_codes = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.783124] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.service_name = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.783279] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.service_type = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.783441] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.split_loggers = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.783601] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.status_code_retries = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.783761] nova-compute[62208]: DEBUG oslo_service.service [None 
req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.status_code_retry_delay = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.783918] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.timeout = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.784089] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.valid_interfaces = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.784249] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_limit.version = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.784413] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_reports.file_event_handler = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.784578] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_reports.file_event_handler_interval = 1 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.784740] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] oslo_reports.log_dir = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.784911] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.785132] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_linux_bridge_privileged.group = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.785227] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.785395] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.785561] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.785721] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_linux_bridge_privileged.user = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.785890] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] 
vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.786049] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_ovs_privileged.group = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.786208] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_ovs_privileged.helper_command = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.786373] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.786536] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.786816] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] vif_plug_ovs_privileged.user = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.786891] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_linux_bridge.flat_interface = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.787037] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.787210] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.787380] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.787576] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.787761] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.787928] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.788104] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.788283] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.788512] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_ovs.isolate_vif = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.788709] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.788882] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.789055] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.789228] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_ovs.ovsdb_interface = native {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.789393] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_vif_ovs.per_port_bridge = False {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.789561] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] os_brick.lock_path = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.789735] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] privsep_osbrick.capabilities = [21] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.789928] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] privsep_osbrick.group = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.790095] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] privsep_osbrick.helper_command = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.790264] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.790430] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] privsep_osbrick.thread_pool_size = 8 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.790589] nova-compute[62208]: DEBUG 
oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] privsep_osbrick.user = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.790840] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.791016] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] nova_sys_admin.group = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.791201] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] nova_sys_admin.helper_command = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.791351] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.791517] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] nova_sys_admin.thread_pool_size = 8 {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.791678] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] nova_sys_admin.user = None {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.791809] nova-compute[62208]: DEBUG oslo_service.service [None req-2cdb21e8-bb3c-4557-917d-82c1f8a1949e None None] ******************************************************************************** {{(pid=62208) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 514.792312] nova-compute[62208]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 514.801085] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Getting list of instances from cluster (obj){ [ 514.801085] nova-compute[62208]: value = "domain-c8" [ 514.801085] nova-compute[62208]: _type = "ClusterComputeResource" [ 514.801085] nova-compute[62208]: } {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 514.802418] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ca10e19-4380-42fc-ae5c-19b6ccf78b31 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.806106] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.806106] nova-compute[62208]: warnings.warn( [ 514.812433] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Got total of 0 instances {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 514.813017] nova-compute[62208]: WARNING nova.virt.vmwareapi.driver [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 514.813521] nova-compute[62208]: INFO nova.virt.node [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Generated node identity 8d308854-9c5b-48ef-bafe-5c6c728e46d8 [ 514.813768] nova-compute[62208]: INFO nova.virt.node [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Wrote node identity 8d308854-9c5b-48ef-bafe-5c6c728e46d8 to /opt/stack/data/n-cpu-1/compute_id [ 514.825776] nova-compute[62208]: WARNING nova.compute.manager [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Compute nodes ['8d308854-9c5b-48ef-bafe-5c6c728e46d8'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 514.862268] nova-compute[62208]: INFO nova.compute.manager [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 514.888392] nova-compute[62208]: WARNING nova.compute.manager [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 514.888671] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 514.888837] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 514.888987] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 514.889139] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 514.890289] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-187c8c6e-1685-4dee-b7e7-fdb6271852d4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.893053] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.893053] nova-compute[62208]: warnings.warn( [ 514.898943] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee59f42c-ed40-4d64-9cc8-21527fe63b9e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.902935] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.902935] nova-compute[62208]: warnings.warn( [ 514.913698] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04d2ba5e-1ed3-4d7f-ab63-d663d6e48475 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.916066] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.916066] nova-compute[62208]: warnings.warn( [ 514.920663] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ec02207-25bb-4211-8d99-6b87388a8edd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.923646] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 514.923646] nova-compute[62208]: warnings.warn( [ 514.950759] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181985MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 514.950961] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 514.951104] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 514.964579] nova-compute[62208]: WARNING nova.compute.resource_tracker [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] No compute node record for cpu-1:8d308854-9c5b-48ef-bafe-5c6c728e46d8: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 8d308854-9c5b-48ef-bafe-5c6c728e46d8 could not be found. [ 514.980992] nova-compute[62208]: INFO nova.compute.resource_tracker [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 [ 515.033964] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 515.034246] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 515.132746] nova-compute[62208]: INFO nova.scheduler.client.report [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] [req-0453c308-74bd-4c9f-aa03-01cc2bb1da1d] Created resource provider record via placement API for resource provider with UUID 8d308854-9c5b-48ef-bafe-5c6c728e46d8 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 515.149170] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c36fd719-4b24-4849-a071-145e40d6e68a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 515.152334] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 515.152334] nova-compute[62208]: warnings.warn( [ 515.157916] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91ae79ae-5ac6-4a8c-bd98-ea08a6e4aeaa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 515.161092] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 515.161092] nova-compute[62208]: warnings.warn( [ 515.187755] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dda84ed8-b896-4838-b52c-b62f34df1bc3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 515.190503] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 515.190503] nova-compute[62208]: warnings.warn( [ 515.195982] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de5c66ca-cbae-4ceb-9819-25c6ee164104 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 515.199973] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 515.199973] nova-compute[62208]: warnings.warn( [ 515.209954] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 515.251028] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Updated inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 515.251507] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Updating resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 generation from 0 to 1 during operation: update_inventory {{(pid=62208) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 515.251789] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 515.299632] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Updating resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 generation from 1 to 2 during operation: update_traits {{(pid=62208) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 515.319389] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 515.319824] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.369s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 515.320128] nova-compute[62208]: DEBUG 
nova.service [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Creating RPC server for service compute {{(pid=62208) start /opt/stack/nova/nova/service.py:182}} [ 515.335967] nova-compute[62208]: DEBUG nova.service [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] Join ServiceGroup membership for this service compute {{(pid=62208) start /opt/stack/nova/nova/service.py:199}} [ 515.336441] nova-compute[62208]: DEBUG nova.servicegroup.drivers.db [None req-d5134686-beb1-4d5d-9c5f-1686f090885d None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = <Service: host=cpu-1, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> {{(pid=62208) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 524.627772] nova-compute[62208]: DEBUG dbcounter [-] [62208] Writing DB stats nova_cell1:SELECT=1 {{(pid=62208) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 524.630909] nova-compute[62208]: DEBUG dbcounter [-] [62208] Writing DB stats nova_cell0:SELECT=1 {{(pid=62208) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 564.452959] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquiring lock "53e0e94e-e81c-44b0-bb52-18759172d614" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 564.453360] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Lock "53e0e94e-e81c-44b0-bb52-18759172d614" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 564.477988] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 564.595515] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 564.595792] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 564.597433] nova-compute[62208]: INFO nova.compute.claims [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 564.725623] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a36979b-fbb8-4be6-af48-fb1392b2781c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.728253] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 564.728253] nova-compute[62208]: warnings.warn( [ 564.733980] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87f316f5-ce50-4570-a0a6-107a3893c173 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.738238] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 564.738238] nova-compute[62208]: warnings.warn( [ 564.767501] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce35fef5-c456-4a48-94fe-76a9e3a118b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.770131] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 564.770131] nova-compute[62208]: warnings.warn( [ 564.775733] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4713b48b-ec56-45f4-915e-3f22f44adebc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.779756] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 564.779756] nova-compute[62208]: warnings.warn( [ 564.790329] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 564.800314] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 564.822483] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 564.823051] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 564.866327] nova-compute[62208]: DEBUG nova.compute.utils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 564.868094] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Not allocating networking since 'none' was specified. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1968}} [ 564.884950] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 564.967146] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 565.996448] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 565.996731] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 565.996860] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 565.997034] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 565.997175] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 565.997318] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
565.997580] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 565.997756] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 565.998147] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 565.998314] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 565.998498] nova-compute[62208]: DEBUG nova.virt.hardware [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 565.999785] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7f13ecf-9afa-4667-be49-06e18577d7de {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.002994] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.002994] nova-compute[62208]: warnings.warn( [ 566.008942] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f87df30-f4d7-49a4-a095-867d7cf792c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.012594] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.012594] nova-compute[62208]: warnings.warn( [ 566.025894] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c32b2b9-d15d-4eb7-9da4-163ad56c7ae1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.035830] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.035830] nova-compute[62208]: warnings.warn( [ 566.045914] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 566.055417] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 566.055782] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6da91b1b-e386-4cef-95e8-034880be582c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.057563] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.057563] nova-compute[62208]: warnings.warn( [ 566.075019] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Created folder: OpenStack in parent group-v4. [ 566.075131] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Creating folder: Project (af82cc76e7814b359c439fa41d827bb6). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 566.075324] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1d22ade4-655e-4642-a141-cfe46fe5244d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.077186] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.077186] nova-compute[62208]: warnings.warn( [ 566.087630] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Created folder: Project (af82cc76e7814b359c439fa41d827bb6) in parent group-v17427. [ 566.087812] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Creating folder: Instances. Parent ref: group-v17428. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 566.088223] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5ba1232f-80e0-41a5-9f13-8f567e4719c6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.089693] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.089693] nova-compute[62208]: warnings.warn( [ 566.098548] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Created folder: Instances in parent group-v17428. [ 566.098821] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 566.099018] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 566.099215] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f849434e-3e05-4eae-be38-51dfa33a2418 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.110555] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.110555] nova-compute[62208]: warnings.warn( [ 566.116397] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 566.116397] nova-compute[62208]: value = "task-38360" [ 566.116397] nova-compute[62208]: _type = "Task" [ 566.116397] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 566.121070] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.121070] nova-compute[62208]: warnings.warn( [ 566.127064] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38360, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 566.620640] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.620640] nova-compute[62208]: warnings.warn( [ 566.626299] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38360, 'name': CreateVM_Task, 'duration_secs': 0.260146} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 566.626472] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 566.626903] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 566.627122] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 566.630117] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-833020de-ad62-4124-801c-5f67b3e24c66 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.640163] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.640163] nova-compute[62208]: warnings.warn( [ 566.661395] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 566.661395] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-4683fa8a-a100-477b-9362-795330ff7a02 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.667701] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.667701] nova-compute[62208]: warnings.warn( [ 566.674749] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Waiting for the task: (returnval){ [ 566.674749] nova-compute[62208]: value = "task-38361" [ 566.674749] nova-compute[62208]: _type = "Task" [ 566.674749] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 566.678395] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 566.678395] nova-compute[62208]: warnings.warn( [ 566.686385] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Task: {'id': task-38361, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 567.178811] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.178811] nova-compute[62208]: warnings.warn( [ 567.184972] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Task: {'id': task-38361, 'name': ReconfigVM_Task, 'duration_secs': 0.11014} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 567.185285] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 567.185663] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.558s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 567.186387] nova-compute[62208]: DEBUG oslo_vmware.service [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcf55dee-d6e1-4e87-88e2-c943a54ab669 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.188931] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.188931] nova-compute[62208]: warnings.warn( [ 567.193193] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 567.193359] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 567.194118] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 567.194407] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1a5f25ce-be3c-4178-b02c-61051ae1598f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.195969] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.195969] nova-compute[62208]: warnings.warn( [ 567.199401] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Waiting for the task: (returnval){ [ 567.199401] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522e0f2d-84fa-ee02-fae5-1e2a6a95a666" [ 567.199401] nova-compute[62208]: _type = "Task" [ 567.199401] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 567.202624] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.202624] nova-compute[62208]: warnings.warn( [ 567.209087] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522e0f2d-84fa-ee02-fae5-1e2a6a95a666, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 567.703979] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.703979] nova-compute[62208]: warnings.warn( [ 567.710213] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 567.710554] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 567.710783] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 567.710927] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 567.711323] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 567.711662] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-914b2b56-630d-454f-8e29-1a81773754bc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.713587] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.713587] nova-compute[62208]: warnings.warn( [ 567.730305] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 567.730502] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 567.731305] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af0cbbe9-c587-4b06-afd4-1a47f5fe0806 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.734525] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.734525] nova-compute[62208]: warnings.warn( [ 567.739210] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-634710cc-13a4-4b6e-9887-5be71b683bd9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.741516] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.741516] nova-compute[62208]: warnings.warn( [ 567.744763] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Waiting for the task: (returnval){ [ 567.744763] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5279173a-00e2-a707-d7d8-a7f03bfeabe7" [ 567.744763] nova-compute[62208]: _type = "Task" [ 567.744763] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 567.747602] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 567.747602] nova-compute[62208]: warnings.warn( [ 567.752578] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5279173a-00e2-a707-d7d8-a7f03bfeabe7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 568.249652] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 568.249652] nova-compute[62208]: warnings.warn( [ 568.255754] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 568.256048] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Creating directory with path [datastore2] vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 568.256280] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8f2d4394-afaa-4821-bc17-a4152fe3f827 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.257994] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 568.257994] nova-compute[62208]: warnings.warn( [ 568.278476] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Created directory with path [datastore2] vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 568.278664] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Fetch image to [datastore2] vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 568.278831] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 568.279609] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad589879-9803-4d56-b4aa-01363d32fb15 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.282275] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 568.282275] nova-compute[62208]: warnings.warn( [ 568.287721] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7384602-6552-4d2d-bfc8-da8e5934bdf1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.290061] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 568.290061] nova-compute[62208]: warnings.warn( [ 568.298474] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c895882-d7cd-4276-9e2e-07abe42e4cf9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.302339] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 568.302339] nova-compute[62208]: warnings.warn( [ 568.331872] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61feac82-5652-45c3-9a1c-36bb4262594d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.334457] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 568.334457] nova-compute[62208]: warnings.warn( [ 568.339357] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f9e28f22-b67c-48d8-b6fe-154f22c5fb70 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.341173] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 568.341173] nova-compute[62208]: warnings.warn( [ 568.370468] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 568.447427] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 568.516796] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 568.517136] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 569.461844] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquiring lock "f7e43c56-e126-4e5a-944a-bba89f2f9744" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 569.462220] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Lock "f7e43c56-e126-4e5a-944a-bba89f2f9744" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 569.480170] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 569.540195] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 569.540453] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 569.541958] nova-compute[62208]: INFO nova.compute.claims [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 569.675331] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72c63861-94a1-467c-b7b1-8f0c3c59bf75 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.682567] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 569.682567] nova-compute[62208]: warnings.warn( [ 569.692765] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fb08f51-9d76-4b95-a96f-89ed574c8eaf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.698902] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 569.698902] nova-compute[62208]: warnings.warn( [ 569.731585] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe34df71-ebbf-4b02-968b-0f505fd8815e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.734159] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 569.734159] nova-compute[62208]: warnings.warn( [ 569.740336] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcb752f6-4075-42e4-96ad-3b244ad2aca4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.750417] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 569.750417] nova-compute[62208]: warnings.warn( [ 569.761910] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 569.771307] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 569.791833] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 569.792366] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 569.839460] nova-compute[62208]: DEBUG nova.compute.utils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 569.840855] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 569.840937] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 569.855586] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 569.904309] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] No network configured {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1188}} [ 569.904686] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Instance network_info: |[]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 569.941293] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 569.975255] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 569.975577] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 569.975785] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 569.975967] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 569.976152] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 569.976302] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 569.976521] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 569.976692] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 569.976868] 
nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 569.977034] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 569.977201] nova-compute[62208]: DEBUG nova.virt.hardware [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 569.978229] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0709c191-394a-42bf-8cab-9933802ac3aa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.980844] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 569.980844] nova-compute[62208]: warnings.warn( [ 569.987428] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34e2ad24-edab-45a3-80f6-39ade5526b8e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.992503] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 569.992503] nova-compute[62208]: warnings.warn( [ 570.003664] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 570.010484] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Creating folder: Project (115127f70b844e338dba8db4576cd0cb). Parent ref: group-v17427. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 570.010866] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-16c6f2d4-5521-4297-b508-9350177cac44 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.013411] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 570.013411] nova-compute[62208]: warnings.warn( [ 570.025422] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Created folder: Project (115127f70b844e338dba8db4576cd0cb) in parent group-v17427. [ 570.025716] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Creating folder: Instances. Parent ref: group-v17431. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 570.026037] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7281709e-4ed7-4da1-b6e4-b7396e1b606d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.027789] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 570.027789] nova-compute[62208]: warnings.warn( [ 570.039606] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Created folder: Instances in parent group-v17431. [ 570.040183] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 570.040455] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 570.040713] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e16b8add-e0bc-4014-94ba-e71d687627d5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.053879] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 570.053879] nova-compute[62208]: warnings.warn( [ 570.059555] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 570.059555] nova-compute[62208]: value = "task-38364" [ 570.059555] nova-compute[62208]: _type = "Task" [ 570.059555] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 570.063567] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 570.063567] nova-compute[62208]: warnings.warn( [ 570.069014] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38364, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 570.565767] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 570.565767] nova-compute[62208]: warnings.warn( [ 570.570396] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38364, 'name': CreateVM_Task, 'duration_secs': 0.267935} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 570.570471] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 570.570901] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 570.571169] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 570.574096] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3498527-3cb7-4168-a16c-3aad53c428fa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.592001] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 570.592001] nova-compute[62208]: warnings.warn( [ 570.610472] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 570.610859] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-c652269c-5788-4020-86be-aa6a49582bca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.621334] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 570.621334] nova-compute[62208]: warnings.warn( [ 570.627335] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Waiting for the task: (returnval){ [ 570.627335] nova-compute[62208]: value = "task-38365" [ 570.627335] nova-compute[62208]: _type = "Task" [ 570.627335] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 570.630633] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 570.630633] nova-compute[62208]: warnings.warn( [ 570.637019] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Task: {'id': task-38365, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 571.131691] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 571.131691] nova-compute[62208]: warnings.warn( [ 571.140904] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Task: {'id': task-38365, 'name': ReconfigVM_Task, 'duration_secs': 0.106707} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 571.140904] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 571.140904] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.567s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 571.140904] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 571.141048] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 571.141048] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 571.141048] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e3588216-6fac-425b-baac-57636b2bae22 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.141048] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 571.141048] nova-compute[62208]: warnings.warn( [ 571.145562] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Waiting for the task: (returnval){ [ 571.145562] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52bbc8d3-64b4-76e1-e0fa-49a3d11f54c9" [ 571.145562] nova-compute[62208]: _type = "Task" [ 571.145562] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 571.149845] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 571.149845] nova-compute[62208]: warnings.warn( [ 571.155157] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52bbc8d3-64b4-76e1-e0fa-49a3d11f54c9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 571.650802] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 571.650802] nova-compute[62208]: warnings.warn( [ 571.656912] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 571.657582] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 571.658045] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 572.853803] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquiring lock "f9954bd1-8df3-445c-bb4c-ee316b7b0447" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 572.854497] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Lock "f9954bd1-8df3-445c-bb4c-ee316b7b0447" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 572.871562] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 573.064314] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.064557] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 573.066408] nova-compute[62208]: INFO nova.compute.claims [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 573.177906] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4ee1819-61e3-4acf-b630-d5bee40135b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.180675] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.180675] nova-compute[62208]: warnings.warn( [ 573.187021] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-501231fe-65a7-4515-826f-e5d45511a1a8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.190011] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.190011] nova-compute[62208]: warnings.warn( [ 573.217393] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1434cce0-4733-4f6c-b432-e22f05d9d706 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.219604] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.219604] nova-compute[62208]: warnings.warn( [ 573.225468] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02768f0f-2a23-4fc1-9ec2-41abeb5a367b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.229352] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.229352] nova-compute[62208]: warnings.warn( [ 573.240267] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 573.249711] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 573.264765] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 573.265572] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 573.312489] nova-compute[62208]: DEBUG nova.compute.utils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 573.314145] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Not allocating networking since 'none' was specified. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1968}} [ 573.331950] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 573.422625] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 573.447299] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 573.447838] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 573.448301] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 573.448733] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 573.449075] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 573.449370] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 573.449776] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 573.450110] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 573.450521] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 573.450897] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 573.451220] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 573.452651] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-799508ba-141a-4753-8d1a-a6052c716bf8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.456254] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.456254] nova-compute[62208]: warnings.warn( [ 573.463535] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0193e05d-53ad-45a6-83d0-29f86fdb8836 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.469269] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.469269] nova-compute[62208]: warnings.warn( [ 573.483381] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 573.491433] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Creating folder: Project (1d4641fdf8474c018bed5274ccbbff34). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.494659] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fad56f52-64a4-4774-ab2d-1929e2718baf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.500306] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.500306] nova-compute[62208]: warnings.warn( [ 573.512439] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Created folder: Project (1d4641fdf8474c018bed5274ccbbff34) in parent group-v17427. [ 573.512642] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Creating folder: Instances. Parent ref: group-v17434. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.512906] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eeb3c162-1bd4-425f-abf1-1c7e138ed329 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.514612] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.514612] nova-compute[62208]: warnings.warn( [ 573.524161] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Created folder: Instances in parent group-v17434. [ 573.524434] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 573.524630] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 573.524832] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-06c16307-f95a-4290-9027-0d5b2df685b8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.538879] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.538879] nova-compute[62208]: warnings.warn( [ 573.546237] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 573.546237] nova-compute[62208]: value = "task-38368" [ 573.546237] nova-compute[62208]: _type = "Task" [ 573.546237] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 573.550691] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 573.550691] nova-compute[62208]: warnings.warn( [ 573.558730] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38368, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 574.051115] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.051115] nova-compute[62208]: warnings.warn( [ 574.057183] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38368, 'name': CreateVM_Task, 'duration_secs': 0.263894} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 574.057732] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 574.057732] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.057970] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 574.061218] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3af4ff9-8d3b-4b78-8a12-0f8e8cb4ce80 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.078473] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.078473] nova-compute[62208]: warnings.warn( [ 574.101549] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Reconfiguring VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 574.102257] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-eac29531-d31e-4817-96b0-cbea09a8ab71 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.118400] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.118400] nova-compute[62208]: warnings.warn( [ 574.125706] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Waiting for the task: (returnval){ [ 574.125706] nova-compute[62208]: value = "task-38369" [ 574.125706] nova-compute[62208]: _type = "Task" [ 574.125706] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 574.129613] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.129613] nova-compute[62208]: warnings.warn( [ 574.135249] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Task: {'id': task-38369, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 574.339237] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.339394] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.339580] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 574.339701] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 574.355246] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 574.355449] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 574.355601] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 574.355728] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 574.356237] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.356879] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.357108] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.357297] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.357531] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.357702] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_power_states {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.379285] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Getting list of instances from cluster (obj){ [ 574.379285] nova-compute[62208]: value = "domain-c8" [ 574.379285] nova-compute[62208]: _type = "ClusterComputeResource" [ 574.379285] nova-compute[62208]: } {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 574.380720] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c854cb9-f10a-4798-861a-cc7bc43762e9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.386617] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.386617] nova-compute[62208]: warnings.warn( [ 574.398086] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Got total of 3 instances {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 574.398325] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 53e0e94e-e81c-44b0-bb52-18759172d614 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 574.398530] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid f7e43c56-e126-4e5a-944a-bba89f2f9744 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 574.398678] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid f9954bd1-8df3-445c-bb4c-ee316b7b0447 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 574.399026] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "53e0e94e-e81c-44b0-bb52-18759172d614" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.399257] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "f7e43c56-e126-4e5a-944a-bba89f2f9744" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.399454] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "f9954bd1-8df3-445c-bb4c-ee316b7b0447" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.399657] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.399903] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 574.400091] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.410435] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.410658] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 574.410817] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 574.410969] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 574.412176] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7463efc-03f1-4dfc-9abb-022f918678c7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.415485] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.415485] nova-compute[62208]: warnings.warn( [ 574.421729] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65ceb60c-bc6e-4a2e-a219-05c39ca1d806 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.427263] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.427263] nova-compute[62208]: warnings.warn( [ 574.438630] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-670b7013-5de6-4b56-bda1-d749405e11f8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.445682] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.445682] nova-compute[62208]: warnings.warn( [ 574.451082] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-559f7e25-ca0f-47fb-8940-99964e8400d1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.454371] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.454371] nova-compute[62208]: warnings.warn( [ 574.493157] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181977MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 574.493157] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.493157] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 574.589395] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 53e0e94e-e81c-44b0-bb52-18759172d614 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 574.589464] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f7e43c56-e126-4e5a-944a-bba89f2f9744 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 574.589551] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f9954bd1-8df3-445c-bb4c-ee316b7b0447 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 574.590075] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 574.590075] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 574.633187] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.633187] nova-compute[62208]: warnings.warn( [ 574.643507] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Task: {'id': task-38369, 'name': ReconfigVM_Task} progress is 14%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 574.673831] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f803be8-c895-4fb3-af09-10144f9936f3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.676433] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.676433] nova-compute[62208]: warnings.warn( [ 574.682155] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-964960e8-aab2-4776-b170-08e0087edbe4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.685430] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.685430] nova-compute[62208]: warnings.warn( [ 574.722103] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-456b8fcc-908a-4190-be8f-4760a31be9ad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.724765] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.724765] nova-compute[62208]: warnings.warn( [ 574.730657] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ccfe206-4a2d-4be3-813a-db14084cb0ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.734776] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.734776] nova-compute[62208]: warnings.warn( [ 574.746074] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 574.764702] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 574.782824] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 574.783018] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 574.783232] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.783601] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Getting list of instances from cluster (obj){ [ 574.783601] nova-compute[62208]: value = "domain-c8" [ 574.783601] nova-compute[62208]: _type = "ClusterComputeResource" [ 574.783601] nova-compute[62208]: } {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 574.784884] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ba8f0d7-75b8-443b-88b5-a89e99f2089e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.788552] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 574.788552] nova-compute[62208]: warnings.warn( [ 574.800177] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Got total of 3 instances {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 575.133494] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 575.133494] nova-compute[62208]: warnings.warn( [ 575.140584] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Task: {'id': task-38369, 'name': ReconfigVM_Task, 'duration_secs': 0.749923} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 575.140970] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Reconfigured VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 575.141252] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 1.083s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 575.141775] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 575.141775] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 575.142109] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 575.142365] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-da4e0407-bd39-499c-98cb-570d27e9ce80 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.144171] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 575.144171] nova-compute[62208]: warnings.warn( [ 575.148364] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Waiting for the task: (returnval){ [ 575.148364] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e8065e-3cf3-c9e8-3550-bf5b4437d529" [ 575.148364] nova-compute[62208]: _type = "Task" [ 575.148364] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 575.151782] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 575.151782] nova-compute[62208]: warnings.warn( [ 575.158656] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e8065e-3cf3-c9e8-3550-bf5b4437d529, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 575.653028] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 575.653028] nova-compute[62208]: warnings.warn( [ 575.659912] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 575.660270] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 575.660439] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 579.232097] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "99698b8b-8a66-46ce-8bf1-cc00239e644b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 579.232097] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "99698b8b-8a66-46ce-8bf1-cc00239e644b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 579.248338] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 579.315014] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 579.315310] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 579.316776] nova-compute[62208]: INFO nova.compute.claims [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 579.484658] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdab0e7c-ce30-4c46-b33c-c7eb66e54874 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 579.486300] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 579.486300] nova-compute[62208]: warnings.warn( [ 579.493480] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5ae567e-afac-42f2-8d56-467f7f3f296e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 579.495466] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 579.495466] nova-compute[62208]: warnings.warn( [ 579.540743] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-513882db-afab-430f-b7a8-95a1e7f93d47 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 579.552030] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 579.552030] nova-compute[62208]: warnings.warn( [ 579.555903] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78f13274-d429-4955-bb34-1611df8bb4fa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 579.563538] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 579.563538] nova-compute[62208]: warnings.warn( [ 579.575319] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 579.607563] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 579.652271] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 579.652810] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 579.729463] nova-compute[62208]: DEBUG nova.compute.utils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 579.731483] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Allocating IP information in the background. 
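Annotation: the claim for instance 99698b8b-8a66-46ce-8bf1-cc00239e644b is admitted against the provider inventory reported above; reading that inventory means folding reserved capacity and allocation_ratio into an effective limit. A worked example with the numbers copied from this report (the helper is illustrative, not placement code):

    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def capacity(inv):
        # Effective schedulable capacity: (total - reserved) * allocation_ratio.
        return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

    for rc, inv in inventory.items():
        print(rc, capacity(inv))
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0 -- an m1.nano claim
    # (1 vCPU, 128 MB RAM, 1 GB root disk) fits comfortably, hence
    # "Claim successful on node domain-c8...".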
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 579.731483] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 579.769274] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 579.891529] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 579.922114] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 579.922360] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 579.922527] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 579.922711] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 579.922856] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Image pref 0:0:0 {{(pid=62208) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 579.923009] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 579.923215] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 579.923371] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 579.923536] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 579.925406] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 579.925406] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 579.925406] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f66cea4-c7fc-4571-8e4a-e578b01aadab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 579.927241] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 579.927241] nova-compute[62208]: warnings.warn( [ 579.934221] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11def3e6-9e2c-49c9-9a9b-cc9c1c1011f9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 579.946819] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
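Annotation: the nova.virt.hardware lines above walk from "Flavor limits 0:0:0" (no preference expressed by flavor or image) through the default limits of 65536 to a single candidate topology for the 1-vCPU m1.nano flavor. A toy enumeration that reproduces the same outcome (illustrative only, not nova.virt.hardware itself):

    # With no preference and effectively unlimited sockets/cores/threads, every
    # factorisation of vcpus into sockets*cores*threads is a candidate; for one
    # vCPU the only candidate is 1:1:1, matching "Got 1 possible topologies".
    from itertools import product

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        for s, c, t in product(range(1, vcpus + 1), repeat=3):
            if (s * c * t == vcpus and s <= max_sockets
                    and c <= max_cores and t <= max_threads):
                yield (s, c, t)

    print(list(possible_topologies(1)))   # [(1, 1, 1)]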
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 579.946819] nova-compute[62208]: warnings.warn( [ 580.039528] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] No network configured {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1188}} [ 580.039711] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Instance network_info: |[]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 580.040069] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 580.046253] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Creating folder: Project (51f4313c66b44c4581d18b7aaf5acd71). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 580.046692] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6d79c2b1-eb6b-4f86-a537-0ea22a11fd6d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.049044] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.049044] nova-compute[62208]: warnings.warn( [ 580.059524] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Created folder: Project (51f4313c66b44c4581d18b7aaf5acd71) in parent group-v17427. [ 580.059694] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Creating folder: Instances. Parent ref: group-v17437. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 580.059961] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-830ac382-e9bc-4c36-bb75-14a9b74b6311 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.061629] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.061629] nova-compute[62208]: warnings.warn( [ 580.070701] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Created folder: Instances in parent group-v17437. [ 580.070930] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 580.071122] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 580.071318] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f3cdecb4-0c70-450e-aa9f-31cf24bae82f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.084482] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.084482] nova-compute[62208]: warnings.warn( [ 580.090637] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 580.090637] nova-compute[62208]: value = "task-38372" [ 580.090637] nova-compute[62208]: _type = "Task" [ 580.090637] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 580.093812] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.093812] nova-compute[62208]: warnings.warn( [ 580.099224] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38372, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 580.269862] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "2e938efc-55d2-4116-8989-354ec339579f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.269862] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "2e938efc-55d2-4116-8989-354ec339579f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.296516] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "75fd2a2c-4ef5-4b42-b309-53cff148c772" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.296762] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "75fd2a2c-4ef5-4b42-b309-53cff148c772" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.298865] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 580.312764] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 580.369026] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.369280] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.370719] nova-compute[62208]: INFO nova.compute.claims [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 580.379488] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.545766] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c48e4ddc-f602-4042-857c-4244e30ba678 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.549954] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.549954] nova-compute[62208]: warnings.warn( [ 580.558125] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e52c778-16f9-4309-8e80-be7a317b8498 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.560712] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.560712] nova-compute[62208]: warnings.warn( [ 580.596218] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6cc5b79-1fa4-46d2-83b6-08259b840935 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.599434] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.599434] nova-compute[62208]: warnings.warn( [ 580.615019] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38372, 'name': CreateVM_Task, 'duration_secs': 0.274004} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 580.615289] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 580.615657] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.615914] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.619219] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6be6245d-3f12-4726-9373-b3fa9a8aa677 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.635080] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.635080] nova-compute[62208]: warnings.warn( [ 580.636331] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.636331] nova-compute[62208]: warnings.warn( [ 580.642033] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6e5844a-6887-4f44-92ad-55c50fee61ca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.658340] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Reconfiguring VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 580.658642] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.658642] nova-compute[62208]: warnings.warn( [ 580.659286] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-29c83a8d-4eda-420c-819c-66040e8a42cd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.678421] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 580.679583] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.679583] nova-compute[62208]: warnings.warn( [ 580.687297] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 580.687297] nova-compute[62208]: value = "task-38373" [ 580.687297] nova-compute[62208]: _type = "Task" [ 580.687297] nova-compute[62208]: } to complete. 
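Annotation: VNC ports 5902 and 5903 in this trace are assigned under the global "vmware.get_and_set_vnc_port" lock, taken via the synchronized decorator (the "acquired by ... :: waited / released ... :: held" lines). The lock exists because choosing the next free console port is a read-modify-write over shared state, so two concurrent spawns could otherwise pick the same port. A hedged sketch of that pattern; the in-memory set is illustrative, whereas the real driver asks vCenter which ports are already taken:

    from oslo_concurrency import lockutils

    _used_ports = {5900, 5901, 5902}           # illustrative starting state

    @lockutils.synchronized('vmware.get_and_set_vnc_port')
    def next_vnc_port(port_min=5900, port_max=6105):
        # Serialised scan-and-claim of the next free VNC port.
        for port in range(port_min, port_max):
            if port not in _used_ports:
                _used_ports.add(port)
                return port
        raise RuntimeError('no free VNC port in range')

    print(next_vnc_port())                     # -> 5903, as reconfigured above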
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 580.692486] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 580.696664] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.696664] nova-compute[62208]: warnings.warn( [ 580.704534] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38373, 'name': ReconfigVM_Task} progress is 14%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 580.713675] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 580.714571] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 580.720052] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.338s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.720052] nova-compute[62208]: INFO nova.compute.claims [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 580.770153] nova-compute[62208]: DEBUG nova.compute.utils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 580.773586] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 580.773826] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 580.803436] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 580.911212] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 580.945424] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6101e20c-e454-41b8-bf1f-4394f4e8017d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.949356] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.949356] nova-compute[62208]: warnings.warn( [ 580.957511] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 580.957990] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 580.958271] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 580.958633] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 580.958894] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 580.959140] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 580.959546] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 580.959808] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 
tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 580.960098] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 580.960366] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 580.960663] nova-compute[62208]: DEBUG nova.virt.hardware [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 580.966140] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c88d043d-839c-4133-b739-bfb047aa34fa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.974321] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f09886a8-bdbe-46f0-93e7-637dc914381f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.977760] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.977760] nova-compute[62208]: warnings.warn( [ 580.978165] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 580.978165] nova-compute[62208]: warnings.warn( [ 581.011073] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0143c3e-beb0-4b62-9804-56b94eda43e1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.019263] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] No network configured {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1188}} [ 581.019454] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance network_info: |[]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 581.020434] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84d8b6ab-62f9-4681-8e4a-0058dffb03b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.022623] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.022623] nova-compute[62208]: warnings.warn( [ 581.023006] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.023006] nova-compute[62208]: warnings.warn( [ 581.036084] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 581.043267] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Creating folder: Project (50c170638d3a497e96b4a5e316fd1509). Parent ref: group-v17427. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.047026] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3ea1a536-433f-4927-b31b-ad6a57b355ae {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.050516] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc1f96e2-7a4a-4f2d-b64f-df3d318d8325 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.056749] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.056749] nova-compute[62208]: warnings.warn( [ 581.057144] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.057144] nova-compute[62208]: warnings.warn( [ 581.069542] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 581.076083] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Created folder: Project (50c170638d3a497e96b4a5e316fd1509) in parent group-v17427. [ 581.076083] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Creating folder: Instances. Parent ref: group-v17440. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.076083] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2dbf023a-7bfd-4616-af84-25850b7ebc11 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.076083] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.076083] nova-compute[62208]: warnings.warn( [ 581.082285] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 581.088031] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Created folder: Instances in parent group-v17440. [ 581.088258] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 581.088773] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 581.089174] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bc763cbf-083c-4c1f-8666-38d73013f7fa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.107230] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.390s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 581.107754] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 581.110525] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.110525] nova-compute[62208]: warnings.warn( [ 581.116511] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 581.116511] nova-compute[62208]: value = "task-38376" [ 581.116511] nova-compute[62208]: _type = "Task" [ 581.116511] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.120076] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.120076] nova-compute[62208]: warnings.warn( [ 581.127623] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38376, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 581.160473] nova-compute[62208]: DEBUG nova.compute.utils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 581.160473] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 581.160473] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 581.172969] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 581.192621] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.192621] nova-compute[62208]: warnings.warn( [ 581.199001] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38373, 'name': ReconfigVM_Task, 'duration_secs': 0.105343} completed successfully. 
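Annotation: nearly every SOAP round-trip in this trace is preceded by urllib3's InsecureRequestWarning because the vCenter connection is made without certificate verification. The durable fix is configuration — point the [vmware] section's ca_file at the vCenter CA bundle so verification can stay enabled — but where unverified HTTPS is deliberate (for example a disposable CI lab such as this one), the warning can be muted instead:

    # Optional: mute the repeated InsecureRequestWarning when unverified HTTPS to
    # vCenter is intentional. Verifying the certificate via nova.conf
    # ([vmware] ca_file=<CA bundle>) removes the warning for the right reason.
    import urllib3

    urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)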
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 581.199369] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Reconfigured VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 581.199640] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.584s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 581.199935] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 581.200147] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 581.200774] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 581.200849] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2bdff959-5f36-41eb-8764-7b1959e7d34c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.202650] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.202650] nova-compute[62208]: warnings.warn( [ 581.206791] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 581.206791] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d49332-992d-24f0-2f06-cefb5def4248" [ 581.206791] nova-compute[62208]: _type = "Task" [ 581.206791] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.211452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.211452] nova-compute[62208]: warnings.warn( [ 581.217384] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d49332-992d-24f0-2f06-cefb5def4248, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 581.258694] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 581.281958] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 581.282358] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 581.282571] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 581.282786] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 581.282952] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 
tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 581.283159] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 581.283400] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 581.283842] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 581.283842] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 581.283979] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 581.284171] nova-compute[62208]: DEBUG nova.virt.hardware [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 581.285067] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3328f109-3371-48c0-b58b-87b395ac6313 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.288350] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.288350] nova-compute[62208]: warnings.warn( [ 581.295149] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad8a98d5-f1d4-4a1e-b3fb-3639c8173d46 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.299538] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.299538] nova-compute[62208]: warnings.warn( [ 581.402853] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] No network configured {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1188}} [ 581.402950] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance network_info: |[]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 581.403261] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 581.410326] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Creating folder: Project (9c3230a0ff3042ac9871fc0af1f5c40b). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.410326] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0e0593ef-21a1-4d03-b428-bdf60a3871d5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.411143] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.411143] nova-compute[62208]: warnings.warn( [ 581.424576] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Created folder: Project (9c3230a0ff3042ac9871fc0af1f5c40b) in parent group-v17427. 
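The surrounding entries show the two oslo.vmware call styles the VMware driver relies on: synchronous invocations such as Folder.CreateFolder, which return their result directly (the "Created folder ..." lines), and task-returning invocations such as Folder.CreateVM_Task or VirtualMachine.ReconfigVM_Task, which hand back a Task reference that wait_for_task() then polls (the "Waiting for the task" and "progress is 0%" lines). A minimal sketch of that pattern follows; the folder name, spec values, credentials and the way the managed-object references are obtained are illustrative assumptions, and only the call shape mirrors what is logged here.

    from oslo_vmware import api as vmware_api


    def build_session(host, user, password):
        # Typical session construction; api_retry_count and task_poll_interval
        # mirror nova.conf's [vmware] options. Credentials are placeholders.
        return vmware_api.VMwareAPISession(
            host, user, password, api_retry_count=10, task_poll_interval=0.5)


    def create_vm_in_new_folder(session, parent_folder_ref, res_pool_ref):
        # parent_folder_ref / res_pool_ref would come from PropertyCollector
        # lookups like the RetrievePropertiesEx calls seen in this log.
        vim = session.vim
        factory = vim.client.factory

        # Synchronous vim call: CreateFolder returns the new folder moref
        # directly (the "Created folder: Instances ..." entries).
        instances_folder = session.invoke_api(
            vim, 'CreateFolder', parent_folder_ref, name='Instances')

        # Minimal VirtualMachineConfigSpec; nova builds a far richer spec
        # (disks, NICs, extraConfig) before calling CreateVM_Task.
        config = factory.create('ns0:VirtualMachineConfigSpec')
        config.name = 'demo-instance'
        config.guestId = 'otherGuest'
        config.numCPUs = 1
        config.memoryMB = 128
        files = factory.create('ns0:VirtualMachineFileInfo')
        files.vmPathName = '[datastore2]'
        config.files = files

        # Asynchronous vim call: CreateVM_Task returns a Task moref which
        # wait_for_task() polls until completion (the "Waiting for the task"
        # and "progress is 0%" entries), raising if vCenter reports an error.
        task = session.invoke_api(
            vim, 'CreateVM_Task', instances_folder,
            config=config, pool=res_pool_ref)
        task_info = session.wait_for_task(task)
        return task_info.result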
[ 581.424774] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Creating folder: Instances. Parent ref: group-v17443. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.425021] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4afdb969-1b7e-4cb0-a3d9-b096c5cd32ad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.426724] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.426724] nova-compute[62208]: warnings.warn( [ 581.436820] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Created folder: Instances in parent group-v17443. [ 581.437169] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 581.437380] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 581.438343] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-900f9316-6cd9-426a-b2b9-29205746f4e7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.452131] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.452131] nova-compute[62208]: warnings.warn( [ 581.461656] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 581.461656] nova-compute[62208]: value = "task-38379" [ 581.461656] nova-compute[62208]: _type = "Task" [ 581.461656] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.466328] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.466328] nova-compute[62208]: warnings.warn( [ 581.476542] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38379, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 581.620358] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.620358] nova-compute[62208]: warnings.warn( [ 581.627164] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38376, 'name': CreateVM_Task, 'duration_secs': 0.288532} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 581.627286] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 581.627691] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 581.628111] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 581.631095] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efdca8e8-15e9-4538-99f3-09594b9230ca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.650210] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.650210] nova-compute[62208]: warnings.warn( [ 581.671008] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Reconfiguring VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 581.671510] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-536db920-dc4c-4407-a8dc-a5180b7f36ad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.683951] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.683951] nova-compute[62208]: warnings.warn( [ 581.692990] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Waiting for the task: (returnval){ [ 581.692990] nova-compute[62208]: value = "task-38380" [ 581.692990] nova-compute[62208]: _type = "Task" [ 581.692990] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.696828] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.696828] nova-compute[62208]: warnings.warn( [ 581.703403] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Task: {'id': task-38380, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 581.717933] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.717933] nova-compute[62208]: warnings.warn( [ 581.725419] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 581.726084] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 581.726307] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 581.969918] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 581.969918] nova-compute[62208]: warnings.warn( [ 581.976721] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38379, 'name': CreateVM_Task, 'duration_secs': 0.273809} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 581.976842] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 581.977096] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 582.074505] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 582.074732] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 582.098685] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 582.176566] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 582.176566] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 582.178025] nova-compute[62208]: INFO nova.compute.claims [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 582.198972] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.198972] nova-compute[62208]: warnings.warn( [ 582.205704] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Task: {'id': task-38380, 'name': ReconfigVM_Task, 'duration_secs': 0.121436} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 582.206028] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Reconfigured VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 582.206487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.578s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 582.206487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.206602] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 582.206897] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 582.207174] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.230s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 582.207382] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-433d7dec-a45e-4455-a581-b6016179442b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.211950] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62c6e74e-c25e-4e06-984f-ed97620d3284 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.227956] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.227956] nova-compute[62208]: warnings.warn( [ 582.229194] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.229194] nova-compute[62208]: warnings.warn( [ 582.233267] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Waiting for the task: (returnval){ [ 582.233267] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52cafbef-676b-1356-c987-2767d24ea995" [ 582.233267] nova-compute[62208]: _type = "Task" [ 582.233267] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 582.259218] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Reconfiguring VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 582.259524] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.259524] nova-compute[62208]: warnings.warn( [ 582.264514] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b77c4021-1d56-48cc-80ba-720ce95e5d0b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.298157] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.298157] nova-compute[62208]: warnings.warn( [ 582.298157] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 582.298157] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 582.298157] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.304600] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Waiting for the task: (returnval){ [ 582.304600] nova-compute[62208]: value = "task-38381" [ 582.304600] nova-compute[62208]: _type = "Task" [ 582.304600] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 582.309275] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.309275] nova-compute[62208]: warnings.warn( [ 582.320675] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Task: {'id': task-38381, 'name': ReconfigVM_Task} progress is 10%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 582.418540] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d815f4a1-2372-4846-999d-8aa2a16c345d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.421208] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.421208] nova-compute[62208]: warnings.warn( [ 582.427201] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be21abd8-a92f-418a-9044-b8a783cc680a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.430637] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.430637] nova-compute[62208]: warnings.warn( [ 582.461816] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42fcc887-faf2-4d11-96dd-7d3eb53e7196 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.465757] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.465757] nova-compute[62208]: warnings.warn( [ 582.471690] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a279b46-2b62-4a78-9cb7-d09fc84a5fe8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.477109] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.477109] nova-compute[62208]: warnings.warn( [ 582.489162] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 582.504672] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 582.533543] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.357s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 582.534040] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 582.616173] nova-compute[62208]: DEBUG nova.compute.utils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 582.617539] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 582.617732] nova-compute[62208]: DEBUG nova.network.neutron [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 582.659724] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 582.809641] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.809641] nova-compute[62208]: warnings.warn( [ 582.815549] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Task: {'id': task-38381, 'name': ReconfigVM_Task, 'duration_secs': 0.106767} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 582.815845] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Reconfigured VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 582.816103] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.609s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 582.816371] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.816788] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 582.816851] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 582.817070] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-55e018ba-39a6-4147-8f73-9b39553162a0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.818811] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.818811] nova-compute[62208]: warnings.warn( [ 582.824643] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Waiting for the task: (returnval){ [ 582.824643] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52fce219-6c8c-19f8-bd14-7023af3bf240" [ 582.824643] nova-compute[62208]: _type = "Task" [ 582.824643] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 582.825666] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 582.831889] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.831889] nova-compute[62208]: warnings.warn( [ 582.838420] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 582.838757] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 582.839022] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.868205] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 582.868892] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 582.869113] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 582.869465] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 582.870104] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 582.870329] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 582.870585] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 582.874291] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 582.874540] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 582.874831] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 582.875013] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 582.876451] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b926abda-ec98-47a0-8167-2afc9035f592 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.880913] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.880913] nova-compute[62208]: warnings.warn( [ 582.887810] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7275f70-135b-40ee-a2b2-56d560546001 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.912096] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 582.912096] nova-compute[62208]: warnings.warn( [ 583.085924] nova-compute[62208]: DEBUG nova.policy [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '595aee4194d34c5f9b359ac62bed91a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1983703e2fb84d38b2b36674a2cc502a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 583.096527] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "bd0eef47-56e8-45b6-92b1-e81400994572" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.096666] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "bd0eef47-56e8-45b6-92b1-e81400994572" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.110975] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 583.181674] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.181934] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.183394] nova-compute[62208]: INFO nova.compute.claims [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 583.423962] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a316df3d-d513-4506-8861-3bb8ac63fa00 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.426539] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 583.426539] nova-compute[62208]: warnings.warn( [ 583.432440] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccfd854d-777e-495e-bea1-03ecdc5cdd0d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.435352] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 583.435352] nova-compute[62208]: warnings.warn( [ 583.463435] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af22c5eb-56d9-45e2-bc7d-7335de5cc77d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.466702] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 583.466702] nova-compute[62208]: warnings.warn( [ 583.474331] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-484f7b80-816e-495c-a28d-03ccf7d0a90f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.478742] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 583.478742] nova-compute[62208]: warnings.warn( [ 583.491818] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 583.503405] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 583.526411] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 583.526990] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 583.586533] nova-compute[62208]: DEBUG nova.compute.utils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 583.588744] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 583.589003] nova-compute[62208]: DEBUG nova.network.neutron [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 583.603828] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 583.709506] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 583.739738] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 583.739738] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 583.739924] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 583.740025] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 583.740178] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 
tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 583.740317] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 583.740531] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 583.740741] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 583.741033] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 583.741122] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 583.741236] nova-compute[62208]: DEBUG nova.virt.hardware [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 583.742121] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6400ec23-9668-429b-b03e-1587d3ba0827 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.745251] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 583.745251] nova-compute[62208]: warnings.warn( [ 583.751101] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c7d0164-bb1a-43a5-badd-14e0cd06e17a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.754840] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 583.754840] nova-compute[62208]: warnings.warn( [ 584.766518] nova-compute[62208]: DEBUG nova.policy [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95a5ddca93b94f80ad0d43ba3a94a2fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9edac09723fe4c44a281e2ca6e6ccf44', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 585.211215] nova-compute[62208]: DEBUG nova.network.neutron [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Successfully created port: d7594078-ff8d-4833-9397-153721382251 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 587.191368] nova-compute[62208]: DEBUG nova.network.neutron [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Successfully created port: b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 588.890256] nova-compute[62208]: DEBUG nova.network.neutron [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Successfully updated port: d7594078-ff8d-4833-9397-153721382251 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 588.905868] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "refresh_cache-4ca19153-519c-49e3-bdfd-1f5ea77b24a0" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 588.906006] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquired lock 
"refresh_cache-4ca19153-519c-49e3-bdfd-1f5ea77b24a0" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 588.906348] nova-compute[62208]: DEBUG nova.network.neutron [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 589.161628] nova-compute[62208]: DEBUG nova.network.neutron [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 590.212543] nova-compute[62208]: DEBUG nova.network.neutron [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Updating instance_info_cache with network_info: [{"id": "d7594078-ff8d-4833-9397-153721382251", "address": "fa:16:3e:65:e8:24", "network": {"id": "988fd73b-64ec-43a2-8538-74fac80be49c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1152551755-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "1983703e2fb84d38b2b36674a2cc502a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ef6889-a40c-40f5-a6e5-d8726606296a", "external-id": "nsx-vlan-transportzone-537", "segmentation_id": 537, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd7594078-ff", "ovs_interfaceid": "d7594078-ff8d-4833-9397-153721382251", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 590.233093] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Releasing lock "refresh_cache-4ca19153-519c-49e3-bdfd-1f5ea77b24a0" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 590.233262] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Instance network_info: |[{"id": "d7594078-ff8d-4833-9397-153721382251", "address": "fa:16:3e:65:e8:24", "network": {"id": "988fd73b-64ec-43a2-8538-74fac80be49c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1152551755-network", "subnets": 
[{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "1983703e2fb84d38b2b36674a2cc502a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ef6889-a40c-40f5-a6e5-d8726606296a", "external-id": "nsx-vlan-transportzone-537", "segmentation_id": 537, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd7594078-ff", "ovs_interfaceid": "d7594078-ff8d-4833-9397-153721382251", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 590.233760] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:65:e8:24', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '53ef6889-a40c-40f5-a6e5-d8726606296a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd7594078-ff8d-4833-9397-153721382251', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 590.243092] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Creating folder: Project (1983703e2fb84d38b2b36674a2cc502a). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 590.244461] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e3520be2-f0d2-4237-aa3c-e2b6800d7718 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.248744] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 590.248744] nova-compute[62208]: warnings.warn( [ 590.259826] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Created folder: Project (1983703e2fb84d38b2b36674a2cc502a) in parent group-v17427. [ 590.260069] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Creating folder: Instances. Parent ref: group-v17446. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 590.260330] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8eca547d-fbea-4675-87e0-7cda231dbc53 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.262553] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 590.262553] nova-compute[62208]: warnings.warn( [ 590.271248] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Created folder: Instances in parent group-v17446. [ 590.271517] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 590.271806] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 590.272216] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7008e538-6d28-42a7-850a-021a1895e8b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.290545] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 590.290545] nova-compute[62208]: warnings.warn( [ 590.298625] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 590.298625] nova-compute[62208]: value = "task-38384" [ 590.298625] nova-compute[62208]: _type = "Task" [ 590.298625] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 590.302028] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 590.302028] nova-compute[62208]: warnings.warn( [ 590.307960] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38384, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 590.415173] nova-compute[62208]: DEBUG nova.compute.manager [req-407fbd51-8112-4fae-ba31-df7b1b21491c req-b0dc0a23-fb3c-4b01-a392-95931617c9b9 service nova] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Received event network-vif-plugged-d7594078-ff8d-4833-9397-153721382251 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 590.415411] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-407fbd51-8112-4fae-ba31-df7b1b21491c req-b0dc0a23-fb3c-4b01-a392-95931617c9b9 service nova] Acquiring lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.415997] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-407fbd51-8112-4fae-ba31-df7b1b21491c req-b0dc0a23-fb3c-4b01-a392-95931617c9b9 service nova] Lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 590.416209] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-407fbd51-8112-4fae-ba31-df7b1b21491c req-b0dc0a23-fb3c-4b01-a392-95931617c9b9 service nova] Lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 590.416382] nova-compute[62208]: DEBUG nova.compute.manager [req-407fbd51-8112-4fae-ba31-df7b1b21491c req-b0dc0a23-fb3c-4b01-a392-95931617c9b9 service nova] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] No waiting events found dispatching network-vif-plugged-d7594078-ff8d-4833-9397-153721382251 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 590.416562] nova-compute[62208]: WARNING nova.compute.manager [req-407fbd51-8112-4fae-ba31-df7b1b21491c req-b0dc0a23-fb3c-4b01-a392-95931617c9b9 service nova] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Received unexpected event network-vif-plugged-d7594078-ff8d-4833-9397-153721382251 for instance with vm_state building and task_state spawning. [ 590.805881] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 590.805881] nova-compute[62208]: warnings.warn( [ 590.808666] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38384, 'name': CreateVM_Task, 'duration_secs': 0.334595} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 590.808856] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 590.846683] nova-compute[62208]: DEBUG nova.network.neutron [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Successfully updated port: b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 590.857173] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.857173] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 590.858162] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c907f934-4fa3-4ed0-8dc1-c436e1b64d9b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.869235] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 590.869235] nova-compute[62208]: warnings.warn( [ 590.889420] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Reconfiguring VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 590.889783] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-de0891f5-b305-426b-8099-fa215ee50635 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.906920] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 590.906920] nova-compute[62208]: warnings.warn( [ 590.909314] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "refresh_cache-bd0eef47-56e8-45b6-92b1-e81400994572" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 590.909389] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquired lock "refresh_cache-bd0eef47-56e8-45b6-92b1-e81400994572" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 590.909484] nova-compute[62208]: DEBUG nova.network.neutron [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 590.912344] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Waiting for the task: (returnval){ [ 590.912344] nova-compute[62208]: value = "task-38385" [ 590.912344] nova-compute[62208]: _type = "Task" [ 590.912344] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 590.915919] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 590.915919] nova-compute[62208]: warnings.warn( [ 590.922424] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Task: {'id': task-38385, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 591.104042] nova-compute[62208]: DEBUG nova.network.neutron [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 591.416962] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 591.416962] nova-compute[62208]: warnings.warn( [ 591.423630] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Task: {'id': task-38385, 'name': ReconfigVM_Task, 'duration_secs': 0.144097} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 591.423912] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Reconfigured VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 591.424164] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.569s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 591.424417] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 591.425111] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 591.425224] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 591.425682] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7071a156-1d27-423c-92ba-a35d0521023d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.427272] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 591.427272] nova-compute[62208]: warnings.warn( [ 591.431915] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Waiting for the task: (returnval){ [ 591.431915] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52ea7fb9-e17d-dbe2-11dc-62bbc41662e3" [ 591.431915] nova-compute[62208]: _type = "Task" [ 591.431915] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 591.435393] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 591.435393] nova-compute[62208]: warnings.warn( [ 591.444950] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52ea7fb9-e17d-dbe2-11dc-62bbc41662e3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 591.858311] nova-compute[62208]: DEBUG nova.network.neutron [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Updating instance_info_cache with network_info: [{"id": "b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c", "address": "fa:16:3e:d4:a5:8a", "network": {"id": "2794f359-9ffa-4188-91e6-f75caedc8531", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1535019895-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "9edac09723fe4c44a281e2ca6e6ccf44", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "63e45f61-1d9b-4660-8d25-89fb68d45cd3", "external-id": "nsx-vlan-transportzone-43", "segmentation_id": 43, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb4e36b4f-2f", "ovs_interfaceid": "b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 591.875294] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Releasing lock "refresh_cache-bd0eef47-56e8-45b6-92b1-e81400994572" {{(pid=62208) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 591.875294] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Instance network_info: |[{"id": "b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c", "address": "fa:16:3e:d4:a5:8a", "network": {"id": "2794f359-9ffa-4188-91e6-f75caedc8531", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1535019895-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "9edac09723fe4c44a281e2ca6e6ccf44", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "63e45f61-1d9b-4660-8d25-89fb68d45cd3", "external-id": "nsx-vlan-transportzone-43", "segmentation_id": 43, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb4e36b4f-2f", "ovs_interfaceid": "b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 591.875503] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d4:a5:8a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '63e45f61-1d9b-4660-8d25-89fb68d45cd3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 591.882385] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Creating folder: Project (9edac09723fe4c44a281e2ca6e6ccf44). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.883157] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ee56c721-da2a-4d3c-b610-1fa47399264d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.887512] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 591.887512] nova-compute[62208]: warnings.warn( [ 591.898059] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Created folder: Project (9edac09723fe4c44a281e2ca6e6ccf44) in parent group-v17427. [ 591.898437] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Creating folder: Instances. Parent ref: group-v17449. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.898817] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bca12185-a0bc-4bfa-ae78-89cac917c361 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.901303] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 591.901303] nova-compute[62208]: warnings.warn( [ 591.913106] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Created folder: Instances in parent group-v17449. [ 591.913106] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 591.913106] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 591.913106] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bcb0371c-5131-4bee-a298-378d27a441c7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.928154] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 591.928154] nova-compute[62208]: warnings.warn( [ 591.943911] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 591.943911] nova-compute[62208]: value = "task-38388" [ 591.943911] nova-compute[62208]: _type = "Task" [ 591.943911] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 591.944283] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 591.944283] nova-compute[62208]: warnings.warn( [ 591.954132] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 591.954132] nova-compute[62208]: warnings.warn( [ 591.955198] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 591.955651] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 591.955999] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 591.963363] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38388, 'name': CreateVM_Task} progress is 10%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 592.448810] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 592.448810] nova-compute[62208]: warnings.warn( [ 592.455470] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38388, 'name': CreateVM_Task, 'duration_secs': 0.356787} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 592.455683] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 592.468121] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 592.468542] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.012s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 592.472161] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec29d451-f7af-42fb-b7ad-1962f6e56ec8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.487165] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 592.487165] nova-compute[62208]: warnings.warn( [ 592.516796] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Reconfiguring VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 592.517321] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-0a8c4601-f432-430c-ad70-2d12746a8cd8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.539143] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 592.539143] nova-compute[62208]: warnings.warn( [ 592.545234] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Waiting for the task: (returnval){ [ 592.545234] nova-compute[62208]: value = "task-38389" [ 592.545234] nova-compute[62208]: _type = "Task" [ 592.545234] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 592.549080] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 592.549080] nova-compute[62208]: warnings.warn( [ 592.555152] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Task: {'id': task-38389, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 593.049257] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 593.049257] nova-compute[62208]: warnings.warn( [ 593.055138] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Task: {'id': task-38389, 'name': ReconfigVM_Task, 'duration_secs': 0.1159} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 593.055474] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Reconfigured VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 593.055692] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.587s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 593.055943] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 593.056112] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 593.056525] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 593.056696] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e6820457-efb5-4f97-9dfa-daa8b0c90016 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.058379] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 593.058379] nova-compute[62208]: warnings.warn( [ 593.064628] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Waiting for the task: (returnval){ [ 593.064628] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]523d2005-9f11-7af2-e78b-cf0bb0160777" [ 593.064628] nova-compute[62208]: _type = "Task" [ 593.064628] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 593.069459] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 593.069459] nova-compute[62208]: warnings.warn( [ 593.076450] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]523d2005-9f11-7af2-e78b-cf0bb0160777, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 593.369628] nova-compute[62208]: DEBUG nova.compute.manager [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Received event network-changed-d7594078-ff8d-4833-9397-153721382251 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 593.369628] nova-compute[62208]: DEBUG nova.compute.manager [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Refreshing instance network info cache due to event network-changed-d7594078-ff8d-4833-9397-153721382251. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 593.369818] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Acquiring lock "refresh_cache-4ca19153-519c-49e3-bdfd-1f5ea77b24a0" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 593.369952] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Acquired lock "refresh_cache-4ca19153-519c-49e3-bdfd-1f5ea77b24a0" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 593.370176] nova-compute[62208]: DEBUG nova.network.neutron [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Refreshing network info cache for port d7594078-ff8d-4833-9397-153721382251 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 593.571394] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 593.571394] nova-compute[62208]: warnings.warn( [ 593.578220] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 593.578478] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 593.578686] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.162449] nova-compute[62208]: DEBUG nova.network.neutron [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Updated VIF entry in instance network info cache for port d7594078-ff8d-4833-9397-153721382251. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 594.162683] nova-compute[62208]: DEBUG nova.network.neutron [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Updating instance_info_cache with network_info: [{"id": "d7594078-ff8d-4833-9397-153721382251", "address": "fa:16:3e:65:e8:24", "network": {"id": "988fd73b-64ec-43a2-8538-74fac80be49c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1152551755-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "1983703e2fb84d38b2b36674a2cc502a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ef6889-a40c-40f5-a6e5-d8726606296a", "external-id": "nsx-vlan-transportzone-537", "segmentation_id": 537, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd7594078-ff", "ovs_interfaceid": "d7594078-ff8d-4833-9397-153721382251", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 594.173117] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Releasing lock "refresh_cache-4ca19153-519c-49e3-bdfd-1f5ea77b24a0" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 594.173371] nova-compute[62208]: DEBUG nova.compute.manager [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Received event network-vif-plugged-b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 594.173594] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Acquiring lock "bd0eef47-56e8-45b6-92b1-e81400994572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 594.173803] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Lock "bd0eef47-56e8-45b6-92b1-e81400994572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 594.177671] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Lock "bd0eef47-56e8-45b6-92b1-e81400994572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 594.177671] nova-compute[62208]: DEBUG nova.compute.manager [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] No waiting events found dispatching network-vif-plugged-b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 594.177671] nova-compute[62208]: WARNING nova.compute.manager [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Received unexpected event network-vif-plugged-b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c for instance with vm_state building and task_state spawning. [ 594.177671] nova-compute[62208]: DEBUG nova.compute.manager [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Received event network-changed-b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 594.177894] nova-compute[62208]: DEBUG nova.compute.manager [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Refreshing instance network info cache due to event network-changed-b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 594.177894] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Acquiring lock "refresh_cache-bd0eef47-56e8-45b6-92b1-e81400994572" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.177894] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Acquired lock "refresh_cache-bd0eef47-56e8-45b6-92b1-e81400994572" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 594.177894] nova-compute[62208]: DEBUG nova.network.neutron [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Refreshing network info cache for port b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 595.107665] nova-compute[62208]: DEBUG nova.network.neutron [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Updated VIF entry in instance network info cache for port b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 595.108215] nova-compute[62208]: DEBUG nova.network.neutron [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Updating instance_info_cache with network_info: [{"id": "b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c", "address": "fa:16:3e:d4:a5:8a", "network": {"id": "2794f359-9ffa-4188-91e6-f75caedc8531", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1535019895-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "9edac09723fe4c44a281e2ca6e6ccf44", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "63e45f61-1d9b-4660-8d25-89fb68d45cd3", "external-id": "nsx-vlan-transportzone-43", "segmentation_id": 43, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb4e36b4f-2f", "ovs_interfaceid": "b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.123708] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-89928fd7-3bb6-4a70-a6e8-f925c50709ea req-314612e2-a3d5-48d2-9f91-aa21cdc28b3e service nova] Releasing lock "refresh_cache-bd0eef47-56e8-45b6-92b1-e81400994572" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 599.928522] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquiring lock "aeb4f446-da3f-4422-a3cd-013b8d6dd174" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.928806] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Lock "aeb4f446-da3f-4422-a3cd-013b8d6dd174" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.959273] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Starting instance... 
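The instance_info_cache update above carries the complete network_info JSON for port b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c. As a minimal sketch (plain Python against an abridged copy of that JSON, not Nova code; summarize_vifs is a hypothetical helper), this is how the commonly needed fields (MAC address, fixed IP, segmentation ID) can be pulled out of such a cache entry:

    import json

    # network_info as logged by update_instance_cache_with_nw_info (abridged).
    network_info_json = '''
    [{"id": "b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c",
      "address": "fa:16:3e:d4:a5:8a",
      "network": {"subnets": [{"cidr": "10.0.0.0/28",
                               "ips": [{"address": "10.0.0.7", "type": "fixed"}]}]},
      "details": {"segmentation_id": 43},
      "devname": "tapb4e36b4f-2f"}]
    '''

    def summarize_vifs(raw):
        """Yield (port_id, mac, fixed_ips, segmentation_id) for each VIF entry."""
        for vif in json.loads(raw):
            fixed = [ip["address"]
                     for subnet in vif["network"]["subnets"]
                     for ip in subnet["ips"] if ip["type"] == "fixed"]
            yield (vif["id"], vif["address"], fixed,
                   vif.get("details", {}).get("segmentation_id"))

    for port_id, mac, ips, seg in summarize_vifs(network_info_json):
        print(port_id, mac, ips, seg)
    # b4e36b4f-2fac-4d47-ae9c-0cdbbd7e138c fa:16:3e:d4:a5:8a ['10.0.0.7'] 43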
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 600.134725] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.135000] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.136562] nova-compute[62208]: INFO nova.compute.claims [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 600.396820] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c0075e-3bcb-4125-a863-7591f0f5922e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.399425] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.399425] nova-compute[62208]: warnings.warn( [ 600.405912] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98a6f602-3738-4c84-a03f-cd8d7eea3026 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.412275] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.412275] nova-compute[62208]: warnings.warn( [ 600.443493] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea755824-5548-434a-ba17-8c2735a733da {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.446615] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.446615] nova-compute[62208]: warnings.warn( [ 600.452460] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-001d55b2-747b-4c87-ac1c-51bd3c3613cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.457968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.457968] nova-compute[62208]: warnings.warn( [ 600.471866] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 600.483318] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 600.517992] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.382s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 600.517992] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Start building networks asynchronously for instance. 
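The "Inventory has not changed" report above lists raw totals, reservations and allocation ratios per resource class for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8. As a rough illustration (not Nova or Placement code), the schedulable capacity implied by those numbers follows the usual (total - reserved) * allocation_ratio rule, i.e. 192 VCPU, 196078 MB of RAM and 400 GB of disk:

    # Inventory exactly as logged for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def capacity(inv):
        """Effective schedulable capacity: (total - reserved) * allocation_ratio."""
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}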
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 600.584667] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquiring lock "29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.585067] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Lock "29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.596286] nova-compute[62208]: DEBUG nova.compute.utils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 600.596286] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Not allocating networking since 'none' was specified. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1968}} [ 600.612305] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 600.642967] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 600.735787] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.736095] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.738284] nova-compute[62208]: INFO nova.compute.claims [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 600.746823] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 600.787904] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 600.788208] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 600.788365] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 600.788612] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c 
tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 600.788781] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 600.788927] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 600.789143] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 600.789297] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 600.789463] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 600.789644] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 600.789903] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 600.790808] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e720aa3-5caf-4223-98c2-d4a0609945d1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.793601] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.793601] nova-compute[62208]: warnings.warn( [ 600.800581] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25fc036b-da56-4949-ba30-5b623ffb2b51 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.811070] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.811070] nova-compute[62208]: warnings.warn( [ 600.834437] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 600.845349] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Creating folder: Project (05bd8a64dc98492c950a42963e6c1d8f). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 600.856462] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5e19c61e-5e0b-430e-a354-e27a6ee19955 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.861456] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.861456] nova-compute[62208]: warnings.warn( [ 600.880504] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Created folder: Project (05bd8a64dc98492c950a42963e6c1d8f) in parent group-v17427. [ 600.880874] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Creating folder: Instances. Parent ref: group-v17455. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 600.881253] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1907c9a8-e509-4446-a254-a6b14399dbd0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.888840] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.888840] nova-compute[62208]: warnings.warn( [ 600.899929] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Created folder: Instances in parent group-v17455. [ 600.900414] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 600.900762] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 600.901114] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3af09032-9c67-4f16-802f-bee78032f6b4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 600.928066] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.928066] nova-compute[62208]: warnings.warn( [ 600.938144] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 600.938144] nova-compute[62208]: value = "task-38399" [ 600.938144] nova-compute[62208]: _type = "Task" [ 600.938144] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 600.941300] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 600.941300] nova-compute[62208]: warnings.warn( [ 600.949208] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38399, 'name': CreateVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 601.119428] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab0ff66b-2200-4613-a33e-2500f4357069 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.122279] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.122279] nova-compute[62208]: warnings.warn( [ 601.127687] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5c64a17-ea2d-4d2f-acd5-fbee12e08687 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.131278] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.131278] nova-compute[62208]: warnings.warn( [ 601.165189] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f4ef7f1-658a-4748-8bff-7abddf926ab9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.170270] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.170270] nova-compute[62208]: warnings.warn( [ 601.175990] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0970c82f-7220-4939-aa88-7b6593a5d6ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.180084] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.180084] nova-compute[62208]: warnings.warn( [ 601.192410] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 601.201759] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 601.230957] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.495s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 601.231470] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 601.276988] nova-compute[62208]: DEBUG nova.compute.utils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 601.278287] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 601.278452] nova-compute[62208]: DEBUG nova.network.neutron [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 601.291185] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 601.343964] nova-compute[62208]: DEBUG nova.policy [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '645c60dc2de54a11988865a841099f03', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e72b87e160434876b347900dae296606', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 601.388078] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 601.414244] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:06:44Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='954869564',id=24,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-470451251',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 601.414517] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 601.414706] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 601.415025] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 601.415297] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 601.415550] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 601.415790] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 601.415951] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 
tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 601.416184] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 601.416369] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 601.416602] nova-compute[62208]: DEBUG nova.virt.hardware [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 601.417592] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c106179-e87e-4b89-8c2f-b79581b376b8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.420163] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.420163] nova-compute[62208]: warnings.warn( [ 601.426593] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88ee6f46-ec93-4d63-969b-17059747ba74 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.430433] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.430433] nova-compute[62208]: warnings.warn( [ 601.443727] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.443727] nova-compute[62208]: warnings.warn( [ 601.449362] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38399, 'name': CreateVM_Task, 'duration_secs': 0.289777} completed successfully. 
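The nova.virt.hardware trace a few entries above (flavor and image CPU limits and preferences all unset, one vCPU) ends with a single candidate topology of 1 socket x 1 core x 1 thread. The sketch below is an illustrative enumeration only, not the actual nova.virt.hardware algorithm, but it shows why one vCPU with effectively unbounded limits can only produce (1, 1, 1):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        """Enumerate (sockets, cores, threads) triples whose product equals vcpus."""
        topologies = []
        for sockets in range(1, min(max_sockets, vcpus) + 1):
            if vcpus % sockets:
                continue
            per_socket = vcpus // sockets
            for cores in range(1, min(max_cores, per_socket) + 1):
                if per_socket % cores:
                    continue
                threads = per_socket // cores
                if threads <= max_threads:
                    topologies.append((sockets, cores, threads))
        return topologies

    print(possible_topologies(1))   # -> [(1, 1, 1)], matching the logged result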
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 601.449492] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 601.449826] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.450115] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.453219] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63d97b92-51b4-4a17-83db-2a2702a11425 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.463157] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.463157] nova-compute[62208]: warnings.warn( [ 601.484415] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Reconfiguring VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 601.484756] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-651e492e-3592-4c46-82fe-312ff064fc69 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.498488] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.498488] nova-compute[62208]: warnings.warn( [ 601.504303] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Waiting for the task: (returnval){ [ 601.504303] nova-compute[62208]: value = "task-38400" [ 601.504303] nova-compute[62208]: _type = "Task" [ 601.504303] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 601.507637] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 601.507637] nova-compute[62208]: warnings.warn( [ 601.515324] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Task: {'id': task-38400, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 602.008791] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.008791] nova-compute[62208]: warnings.warn( [ 602.014786] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Task: {'id': task-38400, 'name': ReconfigVM_Task, 'duration_secs': 0.113983} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 602.015072] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Reconfigured VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 602.015275] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.565s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 602.015996] nova-compute[62208]: DEBUG oslo_vmware.service [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dc474e3-73f2-456f-8d32-6f130c4772ff {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.018427] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.018427] nova-compute[62208]: warnings.warn( [ 602.022719] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 602.022719] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquired lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 602.022719] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 602.022719] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9a23dfac-12cc-42d3-83bb-d86d2ad20035 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.024379] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.024379] nova-compute[62208]: warnings.warn( [ 602.027709] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Waiting for the task: (returnval){ [ 602.027709] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e06d29-c756-3136-d76f-a8bcdfa915be" [ 602.027709] nova-compute[62208]: _type = "Task" [ 602.027709] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 602.030933] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.030933] nova-compute[62208]: warnings.warn( [ 602.036574] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e06d29-c756-3136-d76f-a8bcdfa915be, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 602.362552] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquiring lock "e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.362552] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Lock "e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 602.531990] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.531990] nova-compute[62208]: warnings.warn( [ 602.543014] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Releasing lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 602.543624] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 602.544035] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 602.544349] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquired lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 602.544687] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=62208) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 602.545210] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1089945f-ed43-4103-968a-106df0b835b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.547821] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.547821] nova-compute[62208]: warnings.warn( [ 602.570652] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 602.570652] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 602.570652] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-223d69aa-2e6d-4bde-ab7f-d92e22256d86 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.570652] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.570652] nova-compute[62208]: warnings.warn( [ 602.574679] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac281036-780c-4000-b1b6-ab1543381e30 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.577566] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.577566] nova-compute[62208]: warnings.warn( [ 602.583239] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Waiting for the task: (returnval){ [ 602.583239] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522e644e-5a39-ae1a-23e8-db863c685ef2" [ 602.583239] nova-compute[62208]: _type = "Task" [ 602.583239] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 602.586853] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 602.586853] nova-compute[62208]: warnings.warn( [ 602.598396] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522e644e-5a39-ae1a-23e8-db863c685ef2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 602.626698] nova-compute[62208]: DEBUG nova.network.neutron [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Successfully created port: 9b70a2b2-4247-4555-8bdb-11c1fe338eac {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 602.941900] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "9f48db49-1618-4b04-88a6-315c0f9b889a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.942145] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "9f48db49-1618-4b04-88a6-315c0f9b889a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 603.089096] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 603.089096] nova-compute[62208]: warnings.warn( [ 603.097456] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 603.097966] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Creating directory with path [datastore1] vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 603.098251] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6a41932b-cbd2-45af-a343-0980b98c9497 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.100336] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 603.100336] nova-compute[62208]: warnings.warn( [ 603.121418] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Created directory with path [datastore1] vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 603.121680] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Fetch image to [datastore1] vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 603.121849] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore1] vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore1 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 603.122668] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a9ae9f6-f968-42a1-be0c-42cf292e7f62 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.125377] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 603.125377] nova-compute[62208]: warnings.warn( [ 603.134357] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82715ec9-d9c0-4220-be7a-d19408e84a40 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.134695] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 603.134695] nova-compute[62208]: warnings.warn( [ 603.143212] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-041d2cf0-6ac2-4c2b-9c32-c07080c03424 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.147074] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 603.147074] nova-compute[62208]: warnings.warn( [ 603.182245] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3cd1dff-4335-4515-b142-aca607ffb9ed {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.184898] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 603.184898] nova-compute[62208]: warnings.warn( [ 603.192180] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c5b95b53-b986-4c98-8163-1bfbf1629a96 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.194779] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 603.194779] nova-compute[62208]: warnings.warn( [ 603.225412] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore1 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 603.287751] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 603.359547] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 603.359841] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 603.944599] nova-compute[62208]: DEBUG nova.network.neutron [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Successfully updated port: 9b70a2b2-4247-4555-8bdb-11c1fe338eac {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 603.958733] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquiring lock "refresh_cache-29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 603.958843] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquired lock "refresh_cache-29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 603.958989] nova-compute[62208]: DEBUG nova.network.neutron [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 604.059745] nova-compute[62208]: DEBUG nova.network.neutron [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 604.446227] nova-compute[62208]: DEBUG nova.network.neutron [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Updating instance_info_cache with network_info: [{"id": "9b70a2b2-4247-4555-8bdb-11c1fe338eac", "address": "fa:16:3e:2a:7b:e8", "network": {"id": "8a8a319f-9a98-446b-9b3e-1d18a5998259", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1516197475-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e72b87e160434876b347900dae296606", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d5970ab5-34b8-4065-bfa6-f568b8f103b7", "external-id": "nsx-vlan-transportzone-418", "segmentation_id": 418, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9b70a2b2-42", "ovs_interfaceid": "9b70a2b2-4247-4555-8bdb-11c1fe338eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 604.466634] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Releasing lock "refresh_cache-29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 604.466949] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Instance network_info: |[{"id": "9b70a2b2-4247-4555-8bdb-11c1fe338eac", "address": "fa:16:3e:2a:7b:e8", "network": {"id": "8a8a319f-9a98-446b-9b3e-1d18a5998259", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1516197475-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e72b87e160434876b347900dae296606", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d5970ab5-34b8-4065-bfa6-f568b8f103b7", "external-id": "nsx-vlan-transportzone-418", "segmentation_id": 418, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9b70a2b2-42", "ovs_interfaceid": "9b70a2b2-4247-4555-8bdb-11c1fe338eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 604.467355] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2a:7b:e8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd5970ab5-34b8-4065-bfa6-f568b8f103b7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9b70a2b2-4247-4555-8bdb-11c1fe338eac', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 604.478986] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Creating folder: Project (e72b87e160434876b347900dae296606). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 604.479715] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-70aa6e1a-00ff-4ebb-b2a3-eb548efa081b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.482313] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 604.482313] nova-compute[62208]: warnings.warn( [ 604.491076] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Created folder: Project (e72b87e160434876b347900dae296606) in parent group-v17427. [ 604.491285] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Creating folder: Instances. Parent ref: group-v17459. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 604.491530] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-afe5cd7e-2ba4-41a1-aa2a-49d7fd89e701 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.496078] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 604.496078] nova-compute[62208]: warnings.warn( [ 604.505392] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Created folder: Instances in parent group-v17459. 
[ 604.505685] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 604.505944] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 604.506347] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-72362d29-f31f-4893-8fde-ef80ac4816d2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.521616] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 604.521616] nova-compute[62208]: warnings.warn( [ 604.528271] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 604.528271] nova-compute[62208]: value = "task-38405" [ 604.528271] nova-compute[62208]: _type = "Task" [ 604.528271] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 604.531690] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 604.531690] nova-compute[62208]: warnings.warn( [ 604.540125] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38405, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 605.033034] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 605.033034] nova-compute[62208]: warnings.warn( [ 605.038961] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38405, 'name': CreateVM_Task, 'duration_secs': 0.397296} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 605.039164] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 605.039851] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 605.040253] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 605.042894] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6ae6bb8-2df9-47c7-a1a6-c4e1dc8366bf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 605.053473] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 605.053473] nova-compute[62208]: warnings.warn( [ 605.079068] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Reconfiguring VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 605.079470] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-8b7107e9-b52a-44b9-8373-5643ca2477b3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 605.092909] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 605.092909] nova-compute[62208]: warnings.warn( [ 605.099869] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Waiting for the task: (returnval){ [ 605.099869] nova-compute[62208]: value = "task-38406" [ 605.099869] nova-compute[62208]: _type = "Task" [ 605.099869] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 605.104791] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 605.104791] nova-compute[62208]: warnings.warn( [ 605.113747] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Task: {'id': task-38406, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 605.604296] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 605.604296] nova-compute[62208]: warnings.warn( [ 605.610076] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Task: {'id': task-38406, 'name': ReconfigVM_Task, 'duration_secs': 0.132153} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 605.610365] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Reconfigured VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 605.610568] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.571s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 605.610807] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 605.610943] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquired lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 605.611296] nova-compute[62208]: DEBUG 
oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 605.611677] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-48cdd1e7-f466-4e75-8080-c9b8518d0fe8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 605.613218] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 605.613218] nova-compute[62208]: warnings.warn( [ 605.617097] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Waiting for the task: (returnval){ [ 605.617097] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52de62f5-0753-cc62-0bbc-13ecb7c4aefd" [ 605.617097] nova-compute[62208]: _type = "Task" [ 605.617097] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 605.620353] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 605.620353] nova-compute[62208]: warnings.warn( [ 605.625639] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52de62f5-0753-cc62-0bbc-13ecb7c4aefd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 605.693615] nova-compute[62208]: DEBUG nova.compute.manager [req-b87e683d-6baf-404b-ba00-bff9d8a6d020 req-df8b3a10-543a-4b6d-a49b-cfb64c4d7ab1 service nova] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Received event network-vif-plugged-9b70a2b2-4247-4555-8bdb-11c1fe338eac {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 605.693821] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b87e683d-6baf-404b-ba00-bff9d8a6d020 req-df8b3a10-543a-4b6d-a49b-cfb64c4d7ab1 service nova] Acquiring lock "29ee4419-201b-4d3a-8e7b-c84a5eb36d1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 605.694016] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b87e683d-6baf-404b-ba00-bff9d8a6d020 req-df8b3a10-543a-4b6d-a49b-cfb64c4d7ab1 service nova] Lock "29ee4419-201b-4d3a-8e7b-c84a5eb36d1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 605.694173] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b87e683d-6baf-404b-ba00-bff9d8a6d020 req-df8b3a10-543a-4b6d-a49b-cfb64c4d7ab1 service nova] Lock "29ee4419-201b-4d3a-8e7b-c84a5eb36d1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 605.694329] nova-compute[62208]: DEBUG nova.compute.manager [req-b87e683d-6baf-404b-ba00-bff9d8a6d020 req-df8b3a10-543a-4b6d-a49b-cfb64c4d7ab1 service nova] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] No waiting events found dispatching network-vif-plugged-9b70a2b2-4247-4555-8bdb-11c1fe338eac {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 605.694482] nova-compute[62208]: WARNING nova.compute.manager [req-b87e683d-6baf-404b-ba00-bff9d8a6d020 req-df8b3a10-543a-4b6d-a49b-cfb64c4d7ab1 service nova] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Received unexpected event network-vif-plugged-9b70a2b2-4247-4555-8bdb-11c1fe338eac for instance with vm_state building and task_state spawning. [ 606.121109] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 606.121109] nova-compute[62208]: warnings.warn( [ 606.127755] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Releasing lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 606.128055] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 606.128286] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 607.329384] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "858b585c-7746-4d38-84c9-b3ee719eb406" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 607.330468] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "858b585c-7746-4d38-84c9-b3ee719eb406" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 608.107499] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 608.107742] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.268941] nova-compute[62208]: DEBUG nova.compute.manager [req-a26feaa5-b944-4570-ad95-a30012bf5c76 
req-8f523a26-2b41-4dc0-8ad5-f2d85be66bb0 service nova] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Received event network-changed-9b70a2b2-4247-4555-8bdb-11c1fe338eac {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 609.269194] nova-compute[62208]: DEBUG nova.compute.manager [req-a26feaa5-b944-4570-ad95-a30012bf5c76 req-8f523a26-2b41-4dc0-8ad5-f2d85be66bb0 service nova] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Refreshing instance network info cache due to event network-changed-9b70a2b2-4247-4555-8bdb-11c1fe338eac. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 609.269302] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-a26feaa5-b944-4570-ad95-a30012bf5c76 req-8f523a26-2b41-4dc0-8ad5-f2d85be66bb0 service nova] Acquiring lock "refresh_cache-29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 609.269455] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-a26feaa5-b944-4570-ad95-a30012bf5c76 req-8f523a26-2b41-4dc0-8ad5-f2d85be66bb0 service nova] Acquired lock "refresh_cache-29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 609.269617] nova-compute[62208]: DEBUG nova.network.neutron [req-a26feaa5-b944-4570-ad95-a30012bf5c76 req-8f523a26-2b41-4dc0-8ad5-f2d85be66bb0 service nova] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Refreshing network info cache for port 9b70a2b2-4247-4555-8bdb-11c1fe338eac {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 609.417224] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "08336643-4254-4447-b7c2-b81054bf9707" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.417552] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "08336643-4254-4447-b7c2-b81054bf9707" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.822518] nova-compute[62208]: DEBUG nova.network.neutron [req-a26feaa5-b944-4570-ad95-a30012bf5c76 req-8f523a26-2b41-4dc0-8ad5-f2d85be66bb0 service nova] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Updated VIF entry in instance network info cache for port 9b70a2b2-4247-4555-8bdb-11c1fe338eac. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 609.822881] nova-compute[62208]: DEBUG nova.network.neutron [req-a26feaa5-b944-4570-ad95-a30012bf5c76 req-8f523a26-2b41-4dc0-8ad5-f2d85be66bb0 service nova] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Updating instance_info_cache with network_info: [{"id": "9b70a2b2-4247-4555-8bdb-11c1fe338eac", "address": "fa:16:3e:2a:7b:e8", "network": {"id": "8a8a319f-9a98-446b-9b3e-1d18a5998259", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1516197475-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e72b87e160434876b347900dae296606", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d5970ab5-34b8-4065-bfa6-f568b8f103b7", "external-id": "nsx-vlan-transportzone-418", "segmentation_id": 418, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9b70a2b2-42", "ovs_interfaceid": "9b70a2b2-4247-4555-8bdb-11c1fe338eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.836471] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-a26feaa5-b944-4570-ad95-a30012bf5c76 req-8f523a26-2b41-4dc0-8ad5-f2d85be66bb0 service nova] Releasing lock "refresh_cache-29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 610.248019] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "af8885cb-afba-4724-be10-083e16f8bfc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 610.248251] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "af8885cb-afba-4724-be10-083e16f8bfc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server [None req-980f4688-8878-4ae9-a4d0-fdeb7ea4aaf4 tempest-HypervisorAdminTestJSON-968222684 tempest-HypervisorAdminTestJSON-968222684-project-admin] Exception during message handling: NotImplementedError: Multiple hosts may be managed by the VMWare vCenter driver; therefore we do not return uptime for just one host. 
[ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 610.522479] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 610.523265] nova-compute[62208]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 610.523265] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 6779, in get_host_uptime [ 610.523265] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.driver.get_host_uptime() [ 610.523265] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 707, in get_host_uptime [ 610.523265] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise NotImplementedError(msg) [ 610.523265] nova-compute[62208]: ERROR oslo_messaging.rpc.server NotImplementedError: Multiple hosts may be managed by the VMWare vCenter driver; therefore we do not return uptime for just one host. 
[ 610.523265] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 610.878184] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 610.878418] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 612.048858] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-651c8dda-94fb-4000-bf55-b1cc3f5a8f84 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "d51c3719-bd80-4ad9-945c-c50e16fb3fd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 612.049209] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-651c8dda-94fb-4000-bf55-b1cc3f5a8f84 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "d51c3719-bd80-4ad9-945c-c50e16fb3fd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.598477] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4783afe8-73df-4225-b786-f99c264c99ba tempest-ServersTestBootFromVolume-245018572 tempest-ServersTestBootFromVolume-245018572-project-member] Acquiring lock "b58fe58a-9965-4f7e-808c-a5d004fd855e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 613.598776] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4783afe8-73df-4225-b786-f99c264c99ba tempest-ServersTestBootFromVolume-245018572 tempest-ServersTestBootFromVolume-245018572-project-member] Lock "b58fe58a-9965-4f7e-808c-a5d004fd855e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 615.213044] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 615.213044] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 615.213807] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 615.214985] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 615.215221] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Copying Virtual Disk [datastore2] vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/b9ec835c-8a00-47d7-bfe0-0d106b851120/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 615.215564] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b3395d5a-0901-473b-be28-275014ae7530 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.218209] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.218209] nova-compute[62208]: warnings.warn( [ 615.224208] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Waiting for the task: (returnval){ [ 615.224208] nova-compute[62208]: value = "task-38409" [ 615.224208] nova-compute[62208]: _type = "Task" [ 615.224208] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 615.227643] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.227643] nova-compute[62208]: warnings.warn( [ 615.233695] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Task: {'id': task-38409, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 615.680714] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b098499c-d98a-4bf0-8b5d-c1074793399a tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "aae0a74f-3985-4a51-bae4-3b8124d7fe90" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 615.681463] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b098499c-d98a-4bf0-8b5d-c1074793399a tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "aae0a74f-3985-4a51-bae4-3b8124d7fe90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 615.728847] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.728847] nova-compute[62208]: warnings.warn( [ 615.735820] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 615.736091] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 615.737713] nova-compute[62208]: Faults: ['InvalidArgument'] [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Traceback (most recent call last): [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] yield resources [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] self.driver.spawn(context, instance, image_meta, [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] self._vmops.spawn(context, instance, image_meta, injected_files, [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] self._fetch_image_if_missing(context, vi) [ 615.737713] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] image_cache(vi, tmp_image_ds_loc) [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] vm_util.copy_virtual_disk( [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] session._wait_for_task(vmdk_copy_task) [ 615.738097] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] return self.wait_for_task(task_ref) [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] return evt.wait() [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] result = hub.switch() [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 615.738097] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] return self.greenlet.switch() [ 615.738491] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 615.738491] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] self.f(*self.args, **self.kw) [ 615.738491] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 615.738491] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] raise exceptions.translate_fault(task_info.error) [ 615.738491] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 615.738491] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Faults: ['InvalidArgument'] [ 615.738491] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] [ 615.738491] nova-compute[62208]: INFO nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Terminating instance [ 615.739632] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 615.739843] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Creating directory with path [datastore2] devstack-image-cache_base 
{{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 615.740107] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9855f3f3-4192-4b6d-87ae-4f23d935887b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.742518] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquiring lock "refresh_cache-53e0e94e-e81c-44b0-bb52-18759172d614" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 615.742678] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquired lock "refresh_cache-53e0e94e-e81c-44b0-bb52-18759172d614" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 615.742843] nova-compute[62208]: DEBUG nova.network.neutron [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 615.743968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.743968] nova-compute[62208]: warnings.warn( [ 615.750884] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 615.751123] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 615.752586] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e9755d1-6408-4845-8acb-b5a347cc18fd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.764866] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.764866] nova-compute[62208]: warnings.warn( [ 615.769317] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Waiting for the task: (returnval){ [ 615.769317] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522ce720-8d1c-ecc4-d264-c8eb4c0a4ab8" [ 615.769317] nova-compute[62208]: _type = "Task" [ 615.769317] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 615.772584] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.772584] nova-compute[62208]: warnings.warn( [ 615.778208] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522ce720-8d1c-ecc4-d264-c8eb4c0a4ab8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 615.830595] nova-compute[62208]: DEBUG nova.network.neutron [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 615.886384] nova-compute[62208]: DEBUG nova.network.neutron [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.909205] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Releasing lock "refresh_cache-53e0e94e-e81c-44b0-bb52-18759172d614" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 615.909658] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 615.909864] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 615.911223] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f5146e6-ab89-4e97-a12f-cb7b3bfc6124 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.914326] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.914326] nova-compute[62208]: warnings.warn( [ 615.919903] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 615.920156] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fccd982b-8076-4b98-84f3-71f586b12e07 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.921664] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.921664] nova-compute[62208]: warnings.warn( [ 615.953355] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 615.953355] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 615.953355] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Deleting the datastore file [datastore2] 53e0e94e-e81c-44b0-bb52-18759172d614 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 615.953355] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a04d7c42-4e20-49a2-a558-643b6475dffa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.953355] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.953355] nova-compute[62208]: warnings.warn( [ 615.958716] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Waiting for the task: (returnval){ [ 615.958716] nova-compute[62208]: value = "task-38411" [ 615.958716] nova-compute[62208]: _type = "Task" [ 615.958716] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 615.964319] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 615.964319] nova-compute[62208]: warnings.warn( [ 615.970060] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Task: {'id': task-38411, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 616.273563] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.273563] nova-compute[62208]: warnings.warn( [ 616.280102] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 616.280344] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Creating directory with path [datastore2] vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 616.280643] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e60fe8b8-5577-46ca-b8f8-8783ebf56f20 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.286233] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.286233] nova-compute[62208]: warnings.warn( [ 616.287065] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquiring lock "c9b42581-3793-4641-be04-9a4b17b059cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 616.287380] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "c9b42581-3793-4641-be04-9a4b17b059cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 616.296336] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Created directory with path [datastore2] vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 616.296532] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Fetch image to [datastore2] vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 616.296715] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 
tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 616.297522] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-921731f9-9b76-4abb-89de-0e7d1d42f0ec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.300451] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.300451] nova-compute[62208]: warnings.warn( [ 616.306101] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b57e2273-fed4-4965-ba78-a83a679cc8d6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.308853] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.308853] nova-compute[62208]: warnings.warn( [ 616.317046] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c41ae30a-c4bd-4912-987d-beef8845eec7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.320946] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.320946] nova-compute[62208]: warnings.warn( [ 616.354216] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d4eaf0b-e803-494e-9093-7c7ed77f8169 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.358828] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.358828] nova-compute[62208]: warnings.warn( [ 616.366917] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-71f7352a-98a3-42d0-a0be-2dc46b65d659 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.368874] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.368874] nova-compute[62208]: warnings.warn( [ 616.393465] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 616.466599] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 616.468987] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.468987] nova-compute[62208]: warnings.warn( [ 616.540520] nova-compute[62208]: DEBUG oslo_vmware.api [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Task: {'id': task-38411, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.049386} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 616.542089] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 616.542213] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 616.542768] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 616.542847] nova-compute[62208]: INFO nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Took 0.63 seconds to destroy the instance on the hypervisor. [ 616.543064] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 616.543475] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 616.543636] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 616.543965] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 616.546355] nova-compute[62208]: DEBUG nova.compute.claims [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Aborting claim: <nova.compute.claims.Claim object at 0x7fb937783a00> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 616.546841] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 616.546928] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 616.995562] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55437ba1-ca05-4bfc-a145-03b0b164905c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.998091] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 616.998091] nova-compute[62208]: warnings.warn( [ 617.004820] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e1f014a-f31a-4f2b-b540-47c84c3f7809 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.007968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 617.007968] nova-compute[62208]: warnings.warn( [ 617.040631] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3cab557-583e-4d7b-ad92-ad96d73a0f35 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.044694] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 617.044694] nova-compute[62208]: warnings.warn( [ 617.059631] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b343d22c-c3dd-4228-9ef7-b1520daa63ac {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.060368] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 617.060368] nova-compute[62208]: warnings.warn( [ 617.075798] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 617.085970] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 617.103466] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.556s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 617.104029] nova-compute[62208]: Faults: ['InvalidArgument'] [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Traceback (most recent call last): [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] self.driver.spawn(context, instance, image_meta, [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 617.104029] nova-compute[62208]: 
ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] self._vmops.spawn(context, instance, image_meta, injected_files, [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] self._fetch_image_if_missing(context, vi) [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] image_cache(vi, tmp_image_ds_loc) [ 617.104029] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] vm_util.copy_virtual_disk( [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] session._wait_for_task(vmdk_copy_task) [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] return self.wait_for_task(task_ref) [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] return evt.wait() [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] result = hub.switch() [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] return self.greenlet.switch() [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 617.104477] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] self.f(*self.args, **self.kw) [ 617.104948] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 617.104948] nova-compute[62208]: ERROR nova.compute.manager [instance: 
53e0e94e-e81c-44b0-bb52-18759172d614] raise exceptions.translate_fault(task_info.error) [ 617.104948] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 617.104948] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Faults: ['InvalidArgument'] [ 617.104948] nova-compute[62208]: ERROR nova.compute.manager [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] [ 617.104948] nova-compute[62208]: DEBUG nova.compute.utils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 617.106189] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Build of instance 53e0e94e-e81c-44b0-bb52-18759172d614 was re-scheduled: A specified parameter was not correct: fileType [ 617.106189] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 617.106569] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 617.106970] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquiring lock "refresh_cache-53e0e94e-e81c-44b0-bb52-18759172d614" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 617.106970] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Acquired lock "refresh_cache-53e0e94e-e81c-44b0-bb52-18759172d614" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 617.107089] nova-compute[62208]: DEBUG nova.network.neutron [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 617.148113] nova-compute[62208]: DEBUG nova.network.neutron [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 617.245382] nova-compute[62208]: DEBUG nova.network.neutron [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 617.255464] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Releasing lock "refresh_cache-53e0e94e-e81c-44b0-bb52-18759172d614" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 617.255743] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 617.255934] nova-compute[62208]: DEBUG nova.compute.manager [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] Skipping network deallocation for instance since networking was not requested. {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 617.366972] nova-compute[62208]: INFO nova.scheduler.client.report [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Deleted allocations for instance 53e0e94e-e81c-44b0-bb52-18759172d614 [ 617.390805] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-69d8231d-ef4b-4a0c-a4d9-a85b6c510fc7 tempest-ServersAdmin275Test-1365567481 tempest-ServersAdmin275Test-1365567481-project-member] Lock "53e0e94e-e81c-44b0-bb52-18759172d614" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 52.937s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 617.392018] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "53e0e94e-e81c-44b0-bb52-18759172d614" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 42.993s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 617.392192] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 53e0e94e-e81c-44b0-bb52-18759172d614] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 617.392381] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "53e0e94e-e81c-44b0-bb52-18759172d614" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 617.420338] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 617.493501] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 617.493771] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 617.495601] nova-compute[62208]: INFO nova.compute.claims [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 617.861610] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9866be5-1dfb-4500-878a-1529f746bb69 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.864438] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 617.864438] nova-compute[62208]: warnings.warn( [ 617.869683] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d930e24f-e3ff-49f6-b718-18a0326f52fe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.872857] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 617.872857] nova-compute[62208]: warnings.warn( [ 617.901218] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f745c440-7276-40ce-8bbd-21dce8611646 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.903722] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 617.903722] nova-compute[62208]: warnings.warn( [ 617.909945] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecd039b4-e6cf-4e2e-bab1-82551c56e196 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.913063] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 617.913063] nova-compute[62208]: warnings.warn( [ 617.923255] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 617.932312] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 617.949109] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.455s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 617.949624] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 618.002857] nova-compute[62208]: DEBUG nova.compute.utils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 618.002857] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 618.002857] nova-compute[62208]: DEBUG nova.network.neutron [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 618.016397] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 618.058926] nova-compute[62208]: DEBUG nova.policy [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fecf9c91988d46a5a54c805202ec50af', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '478071478a2242d594b642f8b543bbba', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 618.099907] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 618.129043] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 618.129316] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 618.129557] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 618.129801] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 618.129986] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 618.130170] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 618.130474] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 618.130766] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 618.131022] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 618.131306] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 618.131528] nova-compute[62208]: DEBUG nova.virt.hardware [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 618.133222] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c932b52-1f8e-4fdb-9f00-635eb9257d30 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 618.137094] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 618.137094] nova-compute[62208]: warnings.warn( [ 618.143243] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cc2a0c0-d4f8-41b4-a4f6-81fce6301744 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 618.150201] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 618.150201] nova-compute[62208]: warnings.warn( [ 618.434454] nova-compute[62208]: DEBUG nova.network.neutron [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Successfully created port: 3546727b-2dca-4825-9fda-6a541081d212 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 619.143693] nova-compute[62208]: DEBUG nova.network.neutron [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Successfully updated port: 3546727b-2dca-4825-9fda-6a541081d212 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 619.154362] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquiring lock "refresh_cache-e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 619.154362] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquired lock "refresh_cache-e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 619.154362] nova-compute[62208]: DEBUG nova.network.neutron [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 619.212735] nova-compute[62208]: DEBUG nova.network.neutron [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 619.485988] nova-compute[62208]: DEBUG nova.network.neutron [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Updating instance_info_cache with network_info: [{"id": "3546727b-2dca-4825-9fda-6a541081d212", "address": "fa:16:3e:90:4c:54", "network": {"id": "0c0f14d0-9432-4b66-ba2e-4a8432f02762", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1578079769-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "478071478a2242d594b642f8b543bbba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3546727b-2d", "ovs_interfaceid": "3546727b-2dca-4825-9fda-6a541081d212", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 619.506352] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Releasing lock "refresh_cache-e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 619.506352] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Instance network_info: |[{"id": "3546727b-2dca-4825-9fda-6a541081d212", "address": "fa:16:3e:90:4c:54", "network": {"id": "0c0f14d0-9432-4b66-ba2e-4a8432f02762", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1578079769-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "478071478a2242d594b642f8b543bbba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3546727b-2d", "ovs_interfaceid": "3546727b-2dca-4825-9fda-6a541081d212", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1987}} [ 619.506803] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:90:4c:54', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3546727b-2dca-4825-9fda-6a541081d212', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 619.519883] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Creating folder: Project (478071478a2242d594b642f8b543bbba). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 619.520871] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f2156028-79f1-409c-803d-a56b407a925c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.522737] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 619.522737] nova-compute[62208]: warnings.warn( [ 619.533007] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Created folder: Project (478071478a2242d594b642f8b543bbba) in parent group-v17427. [ 619.533687] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Creating folder: Instances. Parent ref: group-v17462. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 619.534680] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d783321-0b8e-438c-9266-b01e508ca6ec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.536724] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 619.536724] nova-compute[62208]: warnings.warn( [ 619.548720] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Created folder: Instances in parent group-v17462. 
[ 619.549000] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 619.549206] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 619.549417] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-de4e91d9-d3a9-4464-8052-065c985a90ba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.564187] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 619.564187] nova-compute[62208]: warnings.warn( [ 619.570224] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 619.570224] nova-compute[62208]: value = "task-38414" [ 619.570224] nova-compute[62208]: _type = "Task" [ 619.570224] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 619.576063] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 619.576063] nova-compute[62208]: warnings.warn( [ 619.578992] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38414, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 619.849753] nova-compute[62208]: DEBUG nova.compute.manager [req-b6beb97f-d953-49f5-aaa5-2d14e231f54c req-ba063e1b-97c3-404f-b6b8-48c25d3f53f8 service nova] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Received event network-vif-plugged-3546727b-2dca-4825-9fda-6a541081d212 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 619.849970] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b6beb97f-d953-49f5-aaa5-2d14e231f54c req-ba063e1b-97c3-404f-b6b8-48c25d3f53f8 service nova] Acquiring lock "e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 619.850178] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b6beb97f-d953-49f5-aaa5-2d14e231f54c req-ba063e1b-97c3-404f-b6b8-48c25d3f53f8 service nova] Lock "e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 619.850342] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b6beb97f-d953-49f5-aaa5-2d14e231f54c req-ba063e1b-97c3-404f-b6b8-48c25d3f53f8 service nova] Lock "e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 619.850506] nova-compute[62208]: DEBUG nova.compute.manager [req-b6beb97f-d953-49f5-aaa5-2d14e231f54c req-ba063e1b-97c3-404f-b6b8-48c25d3f53f8 service nova] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] No waiting events found dispatching network-vif-plugged-3546727b-2dca-4825-9fda-6a541081d212 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 619.850667] nova-compute[62208]: WARNING nova.compute.manager [req-b6beb97f-d953-49f5-aaa5-2d14e231f54c req-ba063e1b-97c3-404f-b6b8-48c25d3f53f8 service nova] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Received unexpected event network-vif-plugged-3546727b-2dca-4825-9fda-6a541081d212 for instance with vm_state building and task_state spawning. [ 620.074611] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 620.074611] nova-compute[62208]: warnings.warn( [ 620.081390] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38414, 'name': CreateVM_Task, 'duration_secs': 0.355895} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 620.081694] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 620.082192] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 620.082424] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 620.085523] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-542a105b-ecc9-4b6b-a43d-27366de90700 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.098671] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 620.098671] nova-compute[62208]: warnings.warn( [ 620.122969] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 620.123379] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-783d8de0-e050-49f0-954e-caf83e711759 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.133836] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 620.133836] nova-compute[62208]: warnings.warn( [ 620.141616] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Waiting for the task: (returnval){ [ 620.141616] nova-compute[62208]: value = "task-38415" [ 620.141616] nova-compute[62208]: _type = "Task" [ 620.141616] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 620.144426] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 620.144426] nova-compute[62208]: warnings.warn( [ 620.152325] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Task: {'id': task-38415, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 620.645208] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 620.645208] nova-compute[62208]: warnings.warn( [ 620.651776] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Task: {'id': task-38415, 'name': ReconfigVM_Task, 'duration_secs': 0.124941} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 620.652090] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 620.652295] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.570s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 620.652550] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 620.652695] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquired lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 620.653022] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c 
tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 620.653293] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-93cc1d41-1273-4cbf-869d-5c57a072109a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.655005] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 620.655005] nova-compute[62208]: warnings.warn( [ 620.659102] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Waiting for the task: (returnval){ [ 620.659102] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522d4e6b-dfa2-e8fe-5207-f878f1bbade3" [ 620.659102] nova-compute[62208]: _type = "Task" [ 620.659102] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 620.662401] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 620.662401] nova-compute[62208]: warnings.warn( [ 620.669364] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522d4e6b-dfa2-e8fe-5207-f878f1bbade3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 621.162904] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 621.162904] nova-compute[62208]: warnings.warn( [ 621.169966] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Releasing lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 621.170242] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 621.170447] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 622.575078] nova-compute[62208]: DEBUG nova.compute.manager [req-c6ecfdfa-4864-4cf7-89b0-c2bce11e6c4a req-a6edfb39-8f0c-47c4-98fd-12e87a13cc97 service nova] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Received event network-changed-3546727b-2dca-4825-9fda-6a541081d212 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 622.575374] nova-compute[62208]: DEBUG nova.compute.manager [req-c6ecfdfa-4864-4cf7-89b0-c2bce11e6c4a req-a6edfb39-8f0c-47c4-98fd-12e87a13cc97 service nova] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Refreshing instance network info cache due to event network-changed-3546727b-2dca-4825-9fda-6a541081d212. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 622.575469] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c6ecfdfa-4864-4cf7-89b0-c2bce11e6c4a req-a6edfb39-8f0c-47c4-98fd-12e87a13cc97 service nova] Acquiring lock "refresh_cache-e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 622.575615] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c6ecfdfa-4864-4cf7-89b0-c2bce11e6c4a req-a6edfb39-8f0c-47c4-98fd-12e87a13cc97 service nova] Acquired lock "refresh_cache-e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 622.575779] nova-compute[62208]: DEBUG nova.network.neutron [req-c6ecfdfa-4864-4cf7-89b0-c2bce11e6c4a req-a6edfb39-8f0c-47c4-98fd-12e87a13cc97 service nova] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Refreshing network info cache for port 3546727b-2dca-4825-9fda-6a541081d212 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 623.141906] nova-compute[62208]: DEBUG nova.network.neutron [req-c6ecfdfa-4864-4cf7-89b0-c2bce11e6c4a req-a6edfb39-8f0c-47c4-98fd-12e87a13cc97 service nova] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Updated VIF entry in instance network info cache for port 3546727b-2dca-4825-9fda-6a541081d212. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 623.142261] nova-compute[62208]: DEBUG nova.network.neutron [req-c6ecfdfa-4864-4cf7-89b0-c2bce11e6c4a req-a6edfb39-8f0c-47c4-98fd-12e87a13cc97 service nova] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Updating instance_info_cache with network_info: [{"id": "3546727b-2dca-4825-9fda-6a541081d212", "address": "fa:16:3e:90:4c:54", "network": {"id": "0c0f14d0-9432-4b66-ba2e-4a8432f02762", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1578079769-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "478071478a2242d594b642f8b543bbba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3546727b-2d", "ovs_interfaceid": "3546727b-2dca-4825-9fda-6a541081d212", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.155099] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c6ecfdfa-4864-4cf7-89b0-c2bce11e6c4a req-a6edfb39-8f0c-47c4-98fd-12e87a13cc97 service nova] Releasing lock "refresh_cache-e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 625.044167] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 625.044541] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.615896] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 631.649648] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 631.649986] 
nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 631.650197] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 632.141100] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.141368] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 632.141505] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 632.166391] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.166798] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.166798] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.166798] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.166918] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.167576] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.167576] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.167576] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.167576] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.167576] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 632.167807] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 632.168362] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.168694] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.168933] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.169143] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.270797] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-bc94d813-3d3c-40ce-8102-b6786b8ce74d tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] Acquiring lock "61e911d7-b8e9-416e-b73c-574768744974" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 632.270970] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-bc94d813-3d3c-40ce-8102-b6786b8ce74d tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] Lock "61e911d7-b8e9-416e-b73c-574768744974" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.141059] 
nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 633.141422] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 633.153707] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 633.153941] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.154111] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 633.154265] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 633.155395] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9563ed2c-6f93-4dd5-8eb2-cff6e5bce110 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.158680] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 633.158680] nova-compute[62208]: warnings.warn( [ 633.165224] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d2fb39d-1819-4d53-9b26-488a21eda2ca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.173372] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 633.173372] nova-compute[62208]: warnings.warn( [ 633.193053] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd15b2c5-fe9e-4036-9118-99a74896ace5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.195748] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 633.195748] nova-compute[62208]: warnings.warn( [ 633.201714] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f3d0fca-ca71-44b9-801a-10883dc8d051 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.206126] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 633.206126] nova-compute[62208]: warnings.warn( [ 633.239352] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181937MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 633.239625] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 633.239971] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.338700] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f7e43c56-e126-4e5a-944a-bba89f2f9744 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.338959] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f9954bd1-8df3-445c-bb4c-ee316b7b0447 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.339076] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 99698b8b-8a66-46ce-8bf1-cc00239e644b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.339232] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 2e938efc-55d2-4116-8989-354ec339579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.339386] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75fd2a2c-4ef5-4b42-b309-53cff148c772 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.339539] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.339747] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bd0eef47-56e8-45b6-92b1-e81400994572 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.340136] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance aeb4f446-da3f-4422-a3cd-013b8d6dd174 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.340341] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.341469] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 633.381920] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9f48db49-1618-4b04-88a6-315c0f9b889a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.423026] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 858b585c-7746-4d38-84c9-b3ee719eb406 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.447104] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.469091] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.483221] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.503866] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.523731] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d51c3719-bd80-4ad9-945c-c50e16fb3fd1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.544881] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b58fe58a-9965-4f7e-808c-a5d004fd855e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.558288] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance aae0a74f-3985-4a51-bae4-3b8124d7fe90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.575062] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c9b42581-3793-4641-be04-9a4b17b059cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.591769] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.610848] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 61e911d7-b8e9-416e-b73c-574768744974 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 633.611279] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 633.611657] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 633.624700] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4487296c-1b8c-4d90-abc0-e7ad236d5263 tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] Acquiring lock "f5f7e84c-2d39-4929-be15-e7c03fae4319" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 633.626024] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4487296c-1b8c-4d90-abc0-e7ad236d5263 tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] Lock "f5f7e84c-2d39-4929-be15-e7c03fae4319" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 634.054017] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bc5f9db-e137-4f9e-b788-0325e6731dd4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.056967] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 634.056967] nova-compute[62208]: warnings.warn( [ 634.062637] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-948bcb82-ed91-447e-a5de-3bc6e9fb1650 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.065682] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 634.065682] nova-compute[62208]: warnings.warn( [ 634.094727] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88965a80-7e5c-46d1-aec3-b1dda4b18aa9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.097791] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 634.097791] nova-compute[62208]: warnings.warn( [ 634.105559] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c12e5ba-eb1c-4e23-8d90-892845e89417 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.109475] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 634.109475] nova-compute[62208]: warnings.warn( [ 634.120740] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 634.134915] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 634.159745] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 634.160083] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.920s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 634.651580] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0227e88-29e7-4841-981b-4a5deb2ddf23 tempest-ServerActionsTestOtherA-1944985306 tempest-ServerActionsTestOtherA-1944985306-project-member] Acquiring lock "d4e0170a-0993-4f7f-a7fa-6539bb13a082" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 634.651811] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0227e88-29e7-4841-981b-4a5deb2ddf23 tempest-ServerActionsTestOtherA-1944985306 tempest-ServerActionsTestOtherA-1944985306-project-member] Lock "d4e0170a-0993-4f7f-a7fa-6539bb13a082" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 637.162409] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ad115aa5-fb16-42af-b099-d0e79a132f92 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] Acquiring lock "2681fbe1-7ed8-4280-95ac-f98063278b52" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.162409] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ad115aa5-fb16-42af-b099-d0e79a132f92 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] Lock "2681fbe1-7ed8-4280-95ac-f98063278b52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 643.311195] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7ff5b5ec-d8a8-482a-9151-921c71086c10 tempest-AttachInterfacesUnderV243Test-662403438 tempest-AttachInterfacesUnderV243Test-662403438-project-member] Acquiring lock "3be03d93-aae7-4312-832f-5a61b49753bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 643.311572] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7ff5b5ec-d8a8-482a-9151-921c71086c10 tempest-AttachInterfacesUnderV243Test-662403438 tempest-AttachInterfacesUnderV243Test-662403438-project-member] Lock "3be03d93-aae7-4312-832f-5a61b49753bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 644.979827] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-91f2127d-27e2-401b-be3d-e4848189e2e1 tempest-AttachInterfacesV270Test-1164794761 tempest-AttachInterfacesV270Test-1164794761-project-member] Acquiring lock "78471258-10a4-42e2-8d2a-f30b2baaa5d9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 644.980123] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-91f2127d-27e2-401b-be3d-e4848189e2e1 tempest-AttachInterfacesV270Test-1164794761 tempest-AttachInterfacesV270Test-1164794761-project-member] Lock "78471258-10a4-42e2-8d2a-f30b2baaa5d9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 645.013492] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-abb41f28-a102-4838-9aef-3889631c6135 tempest-AttachSCSIVolumeTestJSON-1879792403 tempest-AttachSCSIVolumeTestJSON-1879792403-project-member] Acquiring lock "5d6be180-d89f-44ba-847e-0ea169316d90" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 645.013849] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-abb41f28-a102-4838-9aef-3889631c6135 tempest-AttachSCSIVolumeTestJSON-1879792403 tempest-AttachSCSIVolumeTestJSON-1879792403-project-member] Lock "5d6be180-d89f-44ba-847e-0ea169316d90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 649.500199] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 649.500199] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 649.500882] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore1 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 649.501939] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Caching image {{(pid=62208) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 649.502193] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Copying Virtual Disk [datastore1] vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore1] vmware_temp/a47cb348-a2f3-40fe-97a9-a2734b3084ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 649.502499] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d3297e17-3b63-4594-93ff-04bc9eb138ca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.507447] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 649.507447] nova-compute[62208]: warnings.warn( [ 649.514722] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Waiting for the task: (returnval){ [ 649.514722] nova-compute[62208]: value = "task-38422" [ 649.514722] nova-compute[62208]: _type = "Task" [ 649.514722] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 649.521424] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 649.521424] nova-compute[62208]: warnings.warn( [ 649.527968] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Task: {'id': task-38422, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 650.020578] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.020578] nova-compute[62208]: warnings.warn( [ 650.027618] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 650.027863] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Releasing lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 650.028417] nova-compute[62208]: Faults: ['InvalidArgument'] [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Traceback (most recent call last): [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] yield resources [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] self.driver.spawn(context, instance, image_meta, [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] self._vmops.spawn(context, instance, image_meta, injected_files, [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] self._fetch_image_if_missing(context, vi) [ 650.028417] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] image_cache(vi, tmp_image_ds_loc) [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] vm_util.copy_virtual_disk( [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] session._wait_for_task(vmdk_copy_task) [ 650.028842] 
nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] return self.wait_for_task(task_ref) [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] return evt.wait() [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] result = hub.switch() [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 650.028842] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] return self.greenlet.switch() [ 650.029237] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 650.029237] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] self.f(*self.args, **self.kw) [ 650.029237] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 650.029237] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] raise exceptions.translate_fault(task_info.error) [ 650.029237] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 650.029237] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Faults: ['InvalidArgument'] [ 650.029237] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] [ 650.029237] nova-compute[62208]: INFO nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Terminating instance [ 650.030458] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquired lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 650.030628] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Creating directory 
with path [datastore1] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 650.031160] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquiring lock "refresh_cache-aeb4f446-da3f-4422-a3cd-013b8d6dd174" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 650.031305] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquired lock "refresh_cache-aeb4f446-da3f-4422-a3cd-013b8d6dd174" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 650.031538] nova-compute[62208]: DEBUG nova.network.neutron [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 650.032629] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-26914e16-1b5a-4281-945b-ce257e1c4cef {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.034953] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.034953] nova-compute[62208]: warnings.warn( [ 650.044835] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 650.044835] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 650.046204] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-46594609-a4ea-478b-bae1-714ef00ecee5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.050820] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.050820] nova-compute[62208]: warnings.warn( [ 650.055046] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Waiting for the task: (returnval){ [ 650.055046] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5253176e-3c2d-7835-2b2d-fc64e9dba93b" [ 650.055046] nova-compute[62208]: _type = "Task" [ 650.055046] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 650.058789] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.058789] nova-compute[62208]: warnings.warn( [ 650.069067] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5253176e-3c2d-7835-2b2d-fc64e9dba93b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 650.081157] nova-compute[62208]: DEBUG nova.network.neutron [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 650.133518] nova-compute[62208]: DEBUG nova.network.neutron [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 650.144038] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Releasing lock "refresh_cache-aeb4f446-da3f-4422-a3cd-013b8d6dd174" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 650.144038] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 650.144038] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 650.144038] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-849cb64a-381a-400f-a67a-138c0f2e13bc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.147018] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.147018] nova-compute[62208]: warnings.warn( [ 650.152418] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 650.153128] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1b765036-2a0f-45eb-b931-de4050770ddf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.154748] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.154748] nova-compute[62208]: warnings.warn( [ 650.182209] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 650.182954] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Deleting contents of the VM from datastore datastore1 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 650.183197] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Deleting the datastore file [datastore1] aeb4f446-da3f-4422-a3cd-013b8d6dd174 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 650.183554] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6ad03782-fb41-42ed-8ea4-1b1cc6321295 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.185546] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.185546] nova-compute[62208]: warnings.warn( [ 650.191324] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Waiting for the task: (returnval){ [ 650.191324] nova-compute[62208]: value = "task-38424" [ 650.191324] nova-compute[62208]: _type = "Task" [ 650.191324] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 650.196879] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.196879] nova-compute[62208]: warnings.warn( [ 650.202793] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Task: {'id': task-38424, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 650.559364] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.559364] nova-compute[62208]: warnings.warn( [ 650.565932] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 650.566237] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Creating directory with path [datastore1] vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 650.566515] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8b4ab896-a2fa-45f1-a65e-28b616ce4e6f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.569348] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.569348] nova-compute[62208]: warnings.warn( [ 650.583950] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Created directory with path [datastore1] vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 650.584294] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Fetch image to [datastore1] vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 650.584515] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore1] vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore1 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 650.585778] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c83c2c26-1672-422c-a40c-d6cb5474f117 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.591467] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.591467] nova-compute[62208]: warnings.warn( [ 650.600380] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f968eb8-5e9b-4d23-be09-4bbe527bb6b1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.602782] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.602782] nova-compute[62208]: warnings.warn( [ 650.610830] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7adfb912-d612-4b3c-aa28-787f95bf1557 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.620597] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.620597] nova-compute[62208]: warnings.warn( [ 650.649724] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29fc1b40-5e9f-450f-89c4-21a2071a5d5e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.652280] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.652280] nova-compute[62208]: warnings.warn( [ 650.656480] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-beec3d8d-9d35-4e9d-833b-3f9a64dc2744 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.658650] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.658650] nova-compute[62208]: warnings.warn( [ 650.691596] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore1 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 650.697084] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 650.697084] nova-compute[62208]: warnings.warn( [ 650.703922] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Task: {'id': task-38424, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.046491} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 650.704216] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 650.704402] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Deleted contents of the VM from datastore datastore1 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 650.704571] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 650.704741] nova-compute[62208]: INFO nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Took 0.56 seconds to destroy the instance on the hypervisor. [ 650.705183] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 650.705418] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 650.708300] nova-compute[62208]: DEBUG nova.compute.claims [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9373929b0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 650.708464] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 650.708682] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 650.762220] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 650.840985] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 650.840985] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 651.326014] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af7a2a96-726b-4960-b98f-4dbe63e388cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.328805] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 651.328805] nova-compute[62208]: warnings.warn( [ 651.334856] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57954aec-8c06-4a07-b6f5-cde72d7194a9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.338115] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 651.338115] nova-compute[62208]: warnings.warn( [ 651.371914] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdfc3ed0-2a76-40d1-b823-162b58a692a7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.376399] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 651.376399] nova-compute[62208]: warnings.warn( [ 651.383193] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20831273-d011-4106-833b-648e3a672c77 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.387846] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
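
The InsecureRequestWarning that recurs throughout this log is emitted by urllib3 each time an HTTPS request is made with certificate verification disabled; here that is every call nova-compute makes to the vCenter endpoint. A minimal sketch of the two configurations, assuming plain urllib3 with the certifi CA bundle (the pools below are illustrative, not this environment's client):

    import urllib3
    import certifi

    # Requests made through this pool reproduce the warning seen above:
    # verification is disabled, so urllib3 warns on every request.
    insecure = urllib3.PoolManager(cert_reqs="CERT_NONE")

    # Supplying a CA bundle is the change the warning text recommends;
    # requests through this pool are verified and emit no warning.
    verified = urllib3.PoolManager(cert_reqs="CERT_REQUIRED",
                                   ca_certs=certifi.where())

Whether nova-compute verifies the vCenter certificate is driven by the vmwareapi driver's TLS settings in nova.conf (commonly a CA file plus an "insecure" toggle in the [vmware] section); the repeated warnings indicate verification is off in this run.
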
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 651.387846] nova-compute[62208]: warnings.warn( [ 651.400874] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 651.411654] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 651.439985] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.731s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 651.440619] nova-compute[62208]: Faults: ['InvalidArgument'] [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Traceback (most recent call last): [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] self.driver.spawn(context, instance, image_meta, [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] self._vmops.spawn(context, instance, image_meta, injected_files, [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] self._fetch_image_if_missing(context, vi) [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] image_cache(vi, tmp_image_ds_loc) [ 651.440619] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] vm_util.copy_virtual_disk( [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] session._wait_for_task(vmdk_copy_task) [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] return self.wait_for_task(task_ref) [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] return evt.wait() [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] result = hub.switch() [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] return self.greenlet.switch() [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 651.441053] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] self.f(*self.args, **self.kw) [ 651.441629] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 651.441629] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] raise exceptions.translate_fault(task_info.error) [ 651.441629] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 651.441629] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Faults: ['InvalidArgument'] [ 651.441629] nova-compute[62208]: ERROR nova.compute.manager [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] [ 651.441814] nova-compute[62208]: DEBUG nova.compute.utils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c 
tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 651.444180] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Build of instance aeb4f446-da3f-4422-a3cd-013b8d6dd174 was re-scheduled: A specified parameter was not correct: fileType [ 651.444180] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 651.444635] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 651.444870] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquiring lock "refresh_cache-aeb4f446-da3f-4422-a3cd-013b8d6dd174" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 651.445012] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Acquired lock "refresh_cache-aeb4f446-da3f-4422-a3cd-013b8d6dd174" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 651.445168] nova-compute[62208]: DEBUG nova.network.neutron [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 651.483781] nova-compute[62208]: DEBUG nova.network.neutron [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Instance cache missing network info. 
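
The "Failed to build and run instance" traceback above ends in oslo_vmware.exceptions.VimFaultException with fault InvalidArgument: vCenter accepted the CopyVirtualDisk_Task, the task finished in an error state, and the error surfaces when the task is polled. A minimal sketch of that poll-and-translate pattern, using plain dicts in place of the real vSphere TaskInfo objects (names below are illustrative, not the oslo.vmware implementation):

    import time

    class TaskFault(Exception):
        """Stand-in for a task fault: carries the fault names alongside the message."""
        def __init__(self, message, faults):
            super().__init__(message)
            self.faults = faults

    def wait_for_task(get_task_info, interval=0.5):
        # Poll until the task leaves the queued/running states; an error
        # state becomes an exception, which is what the traceback shows
        # propagating out of _poll_task / wait_for_task.
        while True:
            info = get_task_info()
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise TaskFault(info["message"], info.get("faults", []))
            time.sleep(interval)

The fileType complaint itself comes from vCenter; the compute node only relays it, and the log then shows the resource claim being aborted and the build re-scheduled.
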
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 651.537658] nova-compute[62208]: DEBUG nova.network.neutron [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 651.554336] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Releasing lock "refresh_cache-aeb4f446-da3f-4422-a3cd-013b8d6dd174" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 651.554922] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 651.555224] nova-compute[62208]: DEBUG nova.compute.manager [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] [instance: aeb4f446-da3f-4422-a3cd-013b8d6dd174] Skipping network deallocation for instance since networking was not requested. {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 651.667809] nova-compute[62208]: INFO nova.scheduler.client.report [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Deleted allocations for instance aeb4f446-da3f-4422-a3cd-013b8d6dd174 [ 651.694352] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c9f8bdef-3037-4808-a1e8-4c2345804f2c tempest-ServersAaction247Test-513046594 tempest-ServersAaction247Test-513046594-project-member] Lock "aeb4f446-da3f-4422-a3cd-013b8d6dd174" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 51.765s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 651.715226] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Starting instance... 
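
The paired 'Acquiring lock' / 'Lock ... acquired' / 'Lock ... released' DEBUG lines that bracket the work above come from oslo.concurrency's lockutils helpers. A small sketch of the two usage patterns, assuming illustrative function names (this is not nova code):

    from oslo_concurrency import lockutils

    # Decorator form: emits acquire/release DEBUG messages naming the lock
    # and the decorated callable, like the "compute_resources" lines above.
    @lockutils.synchronized('compute_resources')
    def abort_claim(instance_uuid):
        return instance_uuid

    # Context-manager form: emits the 'Acquiring lock "refresh_cache-..."'
    # style messages around the network info cache refresh.
    def refresh_network_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild instance_info_cache while holding the lock
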
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 651.792874] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 651.793270] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 651.794857] nova-compute[62208]: INFO nova.compute.claims [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 652.102333] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Acquiring lock "4474e61b-0664-40f7-a8ec-be3d14684b10" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.103066] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Lock "4474e61b-0664-40f7-a8ec-be3d14684b10" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.152754] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Acquiring lock "d1a12f05-6178-44bb-9eb0-b52d806fe91d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.153036] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Lock "d1a12f05-6178-44bb-9eb0-b52d806fe91d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.297451] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-947d07cc-f34a-46d5-911b-5872ac806e2d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.300513] 
nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.300513] nova-compute[62208]: warnings.warn( [ 652.306383] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43beffc2-2816-4e30-9505-3dadebcd7132 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.309614] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.309614] nova-compute[62208]: warnings.warn( [ 652.338099] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f9c3e82-315b-4b7b-8a49-838c9a2fef88 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.341176] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.341176] nova-compute[62208]: warnings.warn( [ 652.347290] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecb37be0-0e46-45ba-9d00-b20e629313cf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.351454] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.351454] nova-compute[62208]: warnings.warn( [ 652.361814] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 652.370406] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 652.388378] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.595s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 652.388899] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 652.459554] nova-compute[62208]: DEBUG nova.compute.utils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 652.461770] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Not allocating networking since 'none' was specified. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1968}} [ 652.491714] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 652.731709] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Start spawning the instance on the hypervisor. 
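
The "Inventory has not changed" entries above repeat the full inventory this node reports to placement. Those numbers translate into schedulable capacity as (total - reserved) * allocation_ratio, with max_unit capping any single allocation; a short worked example with the exact values from this log:

    # Inventory as reported above for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 197},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, 'capacity:', capacity, 'max per allocation:', inv['max_unit'])
    # VCPU capacity: 192.0, MEMORY_MB capacity: 196078.0, DISK_GB capacity: 400.0
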
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 652.756975] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 652.757161] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 652.757307] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 652.757485] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 652.757664] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 652.757825] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 652.758031] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 652.758191] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 652.758362] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 652.758586] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 652.759494] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 652.761325] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b27cfc2e-5b5c-49e4-acde-00dd58404e9c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.764249] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.764249] nova-compute[62208]: warnings.warn( [ 652.771738] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0ca9aaa-d277-4ee4-a2ea-9eb530329185 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.775882] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.775882] nova-compute[62208]: warnings.warn( [ 652.792271] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 652.797876] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Creating folder: Project (4f0be149cc3843079db0b2cea8c39ecf). Parent ref: group-v17427. 
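
The nova.virt.hardware lines above walk from flavor/image CPU limits (all unset, so the 65536 maxima apply) to a single candidate topology for this 1-vCPU flavor. A simplified sketch of that enumeration, assuming the straightforward "every factorisation of the vCPU count that fits the maxima" reading of those messages (nova's actual implementation differs in its details and ordering):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Every (sockets, cores, threads) combination whose product equals
        # the vCPU count and that stays within the per-dimension maxima.
        found = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        found.append((sockets, cores, threads))
        return found

    print(possible_topologies(1))   # [(1, 1, 1)] -- the single topology logged above
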
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 652.798224] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f20ba77b-56d9-482c-91f3-a7b871c72ef5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.800074] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.800074] nova-compute[62208]: warnings.warn( [ 652.810496] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Created folder: Project (4f0be149cc3843079db0b2cea8c39ecf) in parent group-v17427. [ 652.810751] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Creating folder: Instances. Parent ref: group-v17469. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 652.810963] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7f55e85a-3c05-463f-82ab-6470c5fb7c2f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.812639] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.812639] nova-compute[62208]: warnings.warn( [ 652.822071] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Created folder: Instances in parent group-v17469. [ 652.822328] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 652.822522] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 652.822730] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9b275bd3-b641-476a-85af-0e9ecda426f9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.835631] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.835631] nova-compute[62208]: warnings.warn( [ 652.841849] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 652.841849] nova-compute[62208]: value = "task-38429" [ 652.841849] nova-compute[62208]: _type = "Task" [ 652.841849] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 652.846715] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 652.846715] nova-compute[62208]: warnings.warn( [ 652.852033] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38429, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 653.347533] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 653.347533] nova-compute[62208]: warnings.warn( [ 653.353707] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38429, 'name': CreateVM_Task} progress is 99%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 653.846377] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 653.846377] nova-compute[62208]: warnings.warn( [ 653.852821] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38429, 'name': CreateVM_Task} progress is 99%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 654.347049] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 654.347049] nova-compute[62208]: warnings.warn( [ 654.353184] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38429, 'name': CreateVM_Task, 'duration_secs': 1.293143} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 654.353368] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 654.353705] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.353984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.356789] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62bf475b-8841-43c8-ac68-243882b1aff7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.367106] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 654.367106] nova-compute[62208]: warnings.warn( [ 654.392524] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Reconfiguring VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 654.393571] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-5c15692f-bdee-4fb2-bee2-ea621ebb9c13 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.405942] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 654.405942] nova-compute[62208]: warnings.warn( [ 654.413902] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Waiting for the task: (returnval){ [ 654.413902] nova-compute[62208]: value = "task-38430" [ 654.413902] nova-compute[62208]: _type = "Task" [ 654.413902] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 654.422717] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 654.422717] nova-compute[62208]: warnings.warn( [ 654.428735] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Task: {'id': task-38430, 'name': ReconfigVM_Task} progress is 14%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 654.918238] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 654.918238] nova-compute[62208]: warnings.warn( [ 654.923882] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Task: {'id': task-38430, 'name': ReconfigVM_Task, 'duration_secs': 0.132617} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 654.924323] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Reconfigured VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 654.924610] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.571s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 654.924917] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 654.925114] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 654.925480] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 
tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 654.925824] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-da098a8a-66f9-41dc-b16d-882e289ab58d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.927658] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 654.927658] nova-compute[62208]: warnings.warn( [ 654.931795] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Waiting for the task: (returnval){ [ 654.931795] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52cce5ef-919a-b5e4-d6d1-27b15fb0d7e0" [ 654.931795] nova-compute[62208]: _type = "Task" [ 654.931795] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 654.936565] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 654.936565] nova-compute[62208]: warnings.warn( [ 654.942845] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52cce5ef-919a-b5e4-d6d1-27b15fb0d7e0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 655.436437] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 655.436437] nova-compute[62208]: warnings.warn( [ 655.443518] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 655.443778] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 655.443991] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 658.046528] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-271bbde6-e4a4-4e8c-becb-2d1755d473a8 tempest-ServerAddressesNegativeTestJSON-1677873053 tempest-ServerAddressesNegativeTestJSON-1677873053-project-member] Acquiring lock "65d39cb0-8eed-49e2-a854-032d527cd0e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 658.046528] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-271bbde6-e4a4-4e8c-becb-2d1755d473a8 tempest-ServerAddressesNegativeTestJSON-1677873053 tempest-ServerAddressesNegativeTestJSON-1677873053-project-member] Lock "65d39cb0-8eed-49e2-a854-032d527cd0e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 664.196772] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = 
self._read_status() [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 664.196772] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 664.197285] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 664.198713] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 664.198991] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Copying Virtual Disk [datastore2] vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/1f86ad74-c3b5-44dd-b22f-bed3bc4350cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 664.199341] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b05927bb-5918-4c6b-9968-a83a194e0eb2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.202126] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.202126] nova-compute[62208]: warnings.warn( [ 664.208019] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Waiting for the task: (returnval){ [ 664.208019] nova-compute[62208]: value = "task-38433" [ 664.208019] nova-compute[62208]: _type = "Task" [ 664.208019] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 664.211553] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
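
The rw_handles warning above is an http.client.RemoteDisconnected raised while closing the write handle: the image bytes have already been streamed to the datastore URL, but the ESX host closes the connection before returning a response to getresponse(). A minimal sketch of that failure mode, with a placeholder host and path rather than this environment's URL:

    import http.client

    def upload_vmdk(host, path, data):
        conn = http.client.HTTPSConnection(host, 443)
        try:
            conn.request("PUT", path, body=data)
            return conn.getresponse().status
        except http.client.RemoteDisconnected:
            # Mirrors the log: the error is only logged as a warning and the
            # flow continues -- the next entry reports the image as downloaded.
            return None
        finally:
            conn.close()
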
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.211553] nova-compute[62208]: warnings.warn( [ 664.217636] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Task: {'id': task-38433, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 664.712576] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.712576] nova-compute[62208]: warnings.warn( [ 664.725653] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 664.725653] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 664.726151] nova-compute[62208]: Faults: ['InvalidArgument'] [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Traceback (most recent call last): [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] yield resources [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] self.driver.spawn(context, instance, image_meta, [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] self._vmops.spawn(context, instance, image_meta, injected_files, [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 664.726151] nova-compute[62208]: ERROR 
nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] self._fetch_image_if_missing(context, vi) [ 664.726151] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] image_cache(vi, tmp_image_ds_loc) [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] vm_util.copy_virtual_disk( [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] session._wait_for_task(vmdk_copy_task) [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] return self.wait_for_task(task_ref) [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] return evt.wait() [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] result = hub.switch() [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 664.726464] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] return self.greenlet.switch() [ 664.726811] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 664.726811] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] self.f(*self.args, **self.kw) [ 664.726811] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 664.726811] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] raise exceptions.translate_fault(task_info.error) [ 664.726811] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 664.726811] nova-compute[62208]: ERROR nova.compute.manager [instance: 
f7e43c56-e126-4e5a-944a-bba89f2f9744] Faults: ['InvalidArgument'] [ 664.726811] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] [ 664.726811] nova-compute[62208]: INFO nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Terminating instance [ 664.728192] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 664.728379] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 664.728938] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquiring lock "refresh_cache-f7e43c56-e126-4e5a-944a-bba89f2f9744" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 664.729073] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquired lock "refresh_cache-f7e43c56-e126-4e5a-944a-bba89f2f9744" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 664.729254] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 664.730219] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2ef4ca63-e6d6-4d53-a451-f1ef4650677a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.741636] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.741636] nova-compute[62208]: warnings.warn( [ 664.749880] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 664.750079] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 664.754294] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-75b08624-528d-4851-ad8d-8fdf5ac198b5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.758162] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.758162] nova-compute[62208]: warnings.warn( [ 664.759424] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Waiting for the task: (returnval){ [ 664.759424] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524a943a-5756-1bfc-7727-f97fe856205a" [ 664.759424] nova-compute[62208]: _type = "Task" [ 664.759424] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 664.762871] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.762871] nova-compute[62208]: warnings.warn( [ 664.776585] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 664.776859] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Creating directory with path [datastore2] vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 664.777174] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fe7a6c7d-7203-42c2-86cf-bf1529170923 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.779282] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.779282] nova-compute[62208]: warnings.warn( [ 664.782576] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 664.799071] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Created directory with path [datastore2] vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 664.799278] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Fetch image to [datastore2] vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 664.799443] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 664.800327] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d40b84e2-c65c-4a77-9d13-c7165d6ecb90 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.802758] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.802758] nova-compute[62208]: warnings.warn( [ 664.810353] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-515cd9a4-1d0a-412b-be2d-a9232782785c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.813720] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.813720] nova-compute[62208]: warnings.warn( [ 664.822776] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-088bb035-8f23-4f1b-97cc-3af3e84323dc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.826850] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.826850] nova-compute[62208]: warnings.warn( [ 664.866971] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 664.868864] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab35fab5-8775-4e9a-b8c9-39dd44502c80 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.871705] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.871705] nova-compute[62208]: warnings.warn( [ 664.876271] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1e19bec7-cb82-4aed-9f9e-57d85a148641 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.878545] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.878545] nova-compute[62208]: warnings.warn( [ 664.880244] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Releasing lock "refresh_cache-f7e43c56-e126-4e5a-944a-bba89f2f9744" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 664.880667] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 664.880894] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 664.882421] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d859384-5c2d-4fea-b50b-6c84784d0c58 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.885035] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.885035] nova-compute[62208]: warnings.warn( [ 664.891052] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 664.891316] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-69bd81f0-9255-4eeb-802b-fd262ebcf8e1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.892892] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.892892] nova-compute[62208]: warnings.warn( [ 664.907311] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 664.922466] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 664.922679] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 664.922860] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Deleting the datastore file [datastore2] f7e43c56-e126-4e5a-944a-bba89f2f9744 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 664.923495] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2b9659c5-31f1-4dfb-915b-e6726cb05e47 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.925444] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.925444] nova-compute[62208]: warnings.warn( [ 664.931083] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Waiting for the task: (returnval){ [ 664.931083] nova-compute[62208]: value = "task-38435" [ 664.931083] nova-compute[62208]: _type = "Task" [ 664.931083] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 664.936401] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 664.936401] nova-compute[62208]: warnings.warn( [ 664.945547] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Task: {'id': task-38435, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 664.978600] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 665.035881] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 665.036077] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 665.436984] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 665.436984] nova-compute[62208]: warnings.warn( [ 665.444915] nova-compute[62208]: DEBUG oslo_vmware.api [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Task: {'id': task-38435, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.033882} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 665.445258] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 665.445505] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 665.445748] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 665.445979] nova-compute[62208]: INFO nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Took 0.57 seconds to destroy the instance on the hypervisor. [ 665.446270] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 665.446514] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 665.446659] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 665.468301] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 665.477302] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 665.486166] nova-compute[62208]: INFO nova.compute.manager [-] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Took 0.04 seconds to deallocate network for instance. 
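The "Acquiring lock ... / Lock ... acquired ... waited / released ... held" DEBUG lines throughout this trace (including the "compute_resources" lock taken next for ResourceTracker.abort_instance_claim and the per-image datastore cache locks) come from oslo.concurrency's named-lock helpers. A minimal sketch of that pattern follows, with illustrative function and lock names; this is not Nova's code, only the lockutils usage the log messages correspond to.

    # Sketch only: the oslo.concurrency named-lock pattern behind the
    # "Acquiring lock ... / acquired ... / released ..." DEBUG lines.
    from oslo_concurrency import lockutils

    # Decorator form, comparable to the in-process "compute_resources" lock
    # held while instance claims are created or aborted.
    @lockutils.synchronized('compute_resources')
    def abort_claim_sketch():
        # resource accounting is updated while the lock is held
        pass

    # Context-manager form, comparable to the per-image cache locks such as
    # "[datastore2] devstack-image-cache_base/<image-id>/<image-id>.vmdk".
    def cache_image_sketch(image_id):
        lock_name = ('[datastore2] devstack-image-cache_base/'
                     f'{image_id}/{image_id}.vmdk')
        with lockutils.lock(lock_name):
            # only one worker fetches/caches this image at a time
            pass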
[ 665.494321] nova-compute[62208]: DEBUG nova.compute.claims [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9377d0640> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 665.494615] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 665.494917] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 665.952248] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0735ee63-fe72-43e8-b9dc-d9f68b72bf29 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.954817] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 665.954817] nova-compute[62208]: warnings.warn( [ 665.960562] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0470a488-6c03-4e2c-bb4a-762f3d94c8ba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.963580] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 665.963580] nova-compute[62208]: warnings.warn( [ 665.993640] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fc41124-a9ad-42de-9828-1177493cde62 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.996616] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 665.996616] nova-compute[62208]: warnings.warn( [ 666.002115] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3baddcd2-b685-4384-9bb3-fce7b7a0545e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.006480] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 666.006480] nova-compute[62208]: warnings.warn( [ 666.019093] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 666.027573] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 666.044467] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.549s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 666.045289] nova-compute[62208]: Faults: ['InvalidArgument'] [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Traceback (most recent call last): [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] self.driver.spawn(context, instance, image_meta, [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 666.045289] 
nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] self._vmops.spawn(context, instance, image_meta, injected_files, [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] self._fetch_image_if_missing(context, vi) [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] image_cache(vi, tmp_image_ds_loc) [ 666.045289] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] vm_util.copy_virtual_disk( [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] session._wait_for_task(vmdk_copy_task) [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] return self.wait_for_task(task_ref) [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] return evt.wait() [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] result = hub.switch() [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] return self.greenlet.switch() [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 666.045709] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] self.f(*self.args, **self.kw) [ 666.046063] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 666.046063] nova-compute[62208]: ERROR nova.compute.manager [instance: 
f7e43c56-e126-4e5a-944a-bba89f2f9744] raise exceptions.translate_fault(task_info.error) [ 666.046063] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 666.046063] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Faults: ['InvalidArgument'] [ 666.046063] nova-compute[62208]: ERROR nova.compute.manager [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] [ 666.047140] nova-compute[62208]: DEBUG nova.compute.utils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 666.049174] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Build of instance f7e43c56-e126-4e5a-944a-bba89f2f9744 was re-scheduled: A specified parameter was not correct: fileType [ 666.049174] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 666.049603] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 666.049843] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquiring lock "refresh_cache-f7e43c56-e126-4e5a-944a-bba89f2f9744" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 666.049995] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Acquired lock "refresh_cache-f7e43c56-e126-4e5a-944a-bba89f2f9744" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 666.050153] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 666.083086] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 666.111774] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 666.121266] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Releasing lock "refresh_cache-f7e43c56-e126-4e5a-944a-bba89f2f9744" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 666.121496] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 666.121641] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 666.121799] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 666.144947] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 666.153727] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 666.163541] nova-compute[62208]: INFO nova.compute.manager [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] Took 0.04 seconds to deallocate network for instance. 
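The two tracebacks above walk the same path: vm_util.copy_virtual_disk() starts a CopyVirtualDisk_Task, session.wait_for_task() waits on it, and _poll_task() raises the translated VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']), after which the build is re-scheduled. The following is a simplified, illustrative polling loop with assumed names and task-info shape; it is not oslo.vmware's implementation, only the pattern the traceback describes.

    # Illustrative sketch of vCenter task polling and fault translation.
    import time

    class VimFaultExceptionSketch(Exception):
        def __init__(self, msg, fault_list):
            super().__init__(msg)
            self.fault_list = fault_list  # e.g. ['InvalidArgument']

    def wait_for_task_sketch(get_task_info, poll_interval=0.5):
        """Poll a task until success, or raise a translated fault on error."""
        while True:
            info = get_task_info()        # assumed helper returning task state
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # mirrors exceptions.translate_fault(task_info.error) in the
                # log: "A specified parameter was not correct: fileType"
                raise VimFaultExceptionSketch(info.error_msg, info.faults)
            time.sleep(poll_interval)     # log shows progress polled at 0%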
[ 666.289844] nova-compute[62208]: INFO nova.scheduler.client.report [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Deleted allocations for instance f7e43c56-e126-4e5a-944a-bba89f2f9744 [ 666.310725] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5b862b4-5a52-4d18-aee6-e63479e0aa50 tempest-ServerDiagnosticsTest-807802384 tempest-ServerDiagnosticsTest-807802384-project-member] Lock "f7e43c56-e126-4e5a-944a-bba89f2f9744" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 96.848s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.311947] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "f7e43c56-e126-4e5a-944a-bba89f2f9744" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 91.913s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.312158] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f7e43c56-e126-4e5a-944a-bba89f2f9744] During sync_power_state the instance has a pending task (spawning). Skip. [ 666.312338] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "f7e43c56-e126-4e5a-944a-bba89f2f9744" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.328773] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 666.383562] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.383838] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.385361] nova-compute[62208]: INFO nova.compute.claims [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 666.752369] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Acquiring lock "bc074836-1520-44cf-aae0-acbfaa7a77e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.752786] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Lock "bc074836-1520-44cf-aae0-acbfaa7a77e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.865445] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b338a351-dc29-40cf-9a28-d2c9f0857b52 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.868601] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 666.868601] nova-compute[62208]: warnings.warn( [ 666.874742] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-315a4f36-f416-4e81-b115-d9c24df70578 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.877845] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 666.877845] nova-compute[62208]: warnings.warn( [ 666.907444] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6304a4c8-5e03-4cfd-ac55-46271ec8f128 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.910110] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 666.910110] nova-compute[62208]: warnings.warn( [ 666.915854] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3698cc04-68b2-490b-a19c-218b7a8b6082 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.919852] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 666.919852] nova-compute[62208]: warnings.warn( [ 666.930456] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 666.939486] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 666.956712] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.573s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.957268] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 667.002172] nova-compute[62208]: DEBUG nova.compute.utils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 667.004427] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 667.004725] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 667.017721] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 667.038123] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] No network configured {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1188}} [ 667.038338] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance network_info: |[]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 667.102527] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 667.125898] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 667.125898] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 667.126094] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 667.126238] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 667.126388] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 667.126535] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 667.126802] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 667.126900] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 667.127143] 
nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 667.127231] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 667.127410] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 667.128394] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b46f9d3a-f69a-4c39-bfd0-30c12234b886 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.130974] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.130974] nova-compute[62208]: warnings.warn( [ 667.137393] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c90a95e-886c-4c01-a24e-b84252a4c328 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.141541] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.141541] nova-compute[62208]: warnings.warn( [ 667.152123] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 667.158759] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Creating folder: Project (ca2450294e7a4a64b3ed38293405eb1e). Parent ref: group-v17427. 
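The nova.virt.hardware records above walk through CPU topology selection: the flavor and image set no limits or preferences (0:0:0), the fallback limits are 65536 per dimension, and for a single vCPU the only valid topology is 1 socket x 1 core x 1 thread. A rough standalone sketch of that enumeration (an illustration only, not Nova's actual implementation, which also orders candidates by preference):

    # Enumerate (sockets, cores, threads) splits of a vCPU count under upper
    # limits, as the hardware.py records above do for 1 vCPU -> (1, 1, 1).
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        found = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        found.append((sockets, cores, threads))
        return found

    print(possible_topologies(1))   # [(1, 1, 1)] -- "Got 1 possible topologies"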
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 667.158952] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-52793513-f6bb-49e9-85db-5dd74c74d65d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.160741] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.160741] nova-compute[62208]: warnings.warn( [ 667.170395] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Created folder: Project (ca2450294e7a4a64b3ed38293405eb1e) in parent group-v17427. [ 667.170588] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Creating folder: Instances. Parent ref: group-v17472. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 667.170836] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f7b58e08-e30d-41d5-a149-354bdd1744e4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.172456] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.172456] nova-compute[62208]: warnings.warn( [ 667.180149] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Created folder: Instances in parent group-v17472. [ 667.180401] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 667.180585] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 667.180781] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bee30ca3-4f43-4c6e-be9f-d58721408e42 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.194052] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.194052] nova-compute[62208]: warnings.warn( [ 667.199803] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 667.199803] nova-compute[62208]: value = "task-38438" [ 667.199803] nova-compute[62208]: _type = "Task" [ 667.199803] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 667.203112] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.203112] nova-compute[62208]: warnings.warn( [ 667.209354] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38438, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 667.703708] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.703708] nova-compute[62208]: warnings.warn( [ 667.710101] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38438, 'name': CreateVM_Task, 'duration_secs': 0.266972} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 667.710295] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 667.710632] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 667.710857] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 667.714317] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c477b8d4-5e48-48e5-bacf-07a7f6f05c63 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.726637] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.726637] nova-compute[62208]: warnings.warn( [ 667.753001] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 667.753370] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-20b2f56a-170f-4a21-963c-adf72121985b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.764845] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.764845] nova-compute[62208]: warnings.warn( [ 667.771111] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Waiting for the task: (returnval){ [ 667.771111] nova-compute[62208]: value = "task-38439" [ 667.771111] nova-compute[62208]: _type = "Task" [ 667.771111] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 667.774203] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 667.774203] nova-compute[62208]: warnings.warn( [ 667.779960] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Task: {'id': task-38439, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 668.275044] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 668.275044] nova-compute[62208]: warnings.warn( [ 668.281440] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Task: {'id': task-38439, 'name': ReconfigVM_Task, 'duration_secs': 0.123569} completed successfully. 
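The CreateVM_Task and ReconfigVM_Task records above follow oslo.vmware's session/task pattern: invoke a vSphere API method, then poll the returned task until it completes. A minimal sketch of that pattern (the host, credentials, managed-object references and spec below are placeholders, not values from this deployment; cacert/insecure are assumed here to be the oslo.vmware session arguments that enable certificate verification and so avoid the urllib3 InsecureRequestWarning lines repeated throughout this log):

    # Minimal oslo.vmware session/task sketch; all concrete values are placeholders.
    from oslo_vmware import api

    vm_folder_ref = None    # folder managed-object reference from vCenter
    res_pool_ref = None     # resource pool managed-object reference
    config_spec = None      # VirtualMachineConfigSpec built by the caller

    session = api.VMwareAPISession(
        'vcenter.example.org', 'user', 'password',
        api_retry_count=10, task_poll_interval=0.5,
        cacert='/etc/ssl/certs/vcenter-ca.pem',  # verified TLS instead of unverified requests
        insecure=False)

    task = session.invoke_api(session.vim, 'CreateVM_Task',
                              vm_folder_ref, config=config_spec, pool=res_pool_ref)
    session.wait_for_task(task)   # polls progress (0% ... 100%) until success or fault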
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 668.281731] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 668.281971] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.571s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 668.282227] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 668.282375] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 668.282689] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 668.282934] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5f7fedc2-2126-4eb6-8221-488879e43d30 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 668.284558] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 668.284558] nova-compute[62208]: warnings.warn( [ 668.288247] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Waiting for the task: (returnval){ [ 668.288247] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52c2c3ef-b09d-bff0-6814-129a6036ef86" [ 668.288247] nova-compute[62208]: _type = "Task" [ 668.288247] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 668.291986] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 668.291986] nova-compute[62208]: warnings.warn( [ 668.298070] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52c2c3ef-b09d-bff0-6814-129a6036ef86, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 668.793242] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 668.793242] nova-compute[62208]: warnings.warn( [ 668.799676] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 668.799978] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 668.800221] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 669.703927] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 669.704190] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.159324] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.159637] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 693.159637] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 693.183372] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183372] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183372] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183372] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183372] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183689] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183689] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183689] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183689] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183689] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 693.183835] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 693.183835] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.184051] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.184206] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 693.184360] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.194814] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 693.195038] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.195215] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 693.195360] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 693.196551] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-539d108a-3924-495a-a3d5-29f7805aeaa4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.205129] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 693.205129] nova-compute[62208]: warnings.warn( [ 693.211658] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bec1c2ab-0858-4b8c-b8ff-ec316d7382c5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.215587] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 693.215587] nova-compute[62208]: warnings.warn( [ 693.228496] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fb9caf8-731f-4be6-9d6f-1f79286ed3ec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.230981] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 693.230981] nova-compute[62208]: warnings.warn( [ 693.237849] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6accb0aa-dcc1-4c92-bff7-a0fb0866436b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.241857] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 693.241857] nova-compute[62208]: warnings.warn( [ 693.271274] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181945MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 693.271274] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 693.271274] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.338747] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f9954bd1-8df3-445c-bb4c-ee316b7b0447 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.338932] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 99698b8b-8a66-46ce-8bf1-cc00239e644b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.339066] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 2e938efc-55d2-4116-8989-354ec339579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.339188] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75fd2a2c-4ef5-4b42-b309-53cff148c772 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.339307] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.339425] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bd0eef47-56e8-45b6-92b1-e81400994572 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.339539] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.339653] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.339764] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9f48db49-1618-4b04-88a6-315c0f9b889a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.339902] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 858b585c-7746-4d38-84c9-b3ee719eb406 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 693.351312] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.362515] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.373265] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.384901] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.395555] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d51c3719-bd80-4ad9-945c-c50e16fb3fd1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.414277] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b58fe58a-9965-4f7e-808c-a5d004fd855e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.425259] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance aae0a74f-3985-4a51-bae4-3b8124d7fe90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.440666] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c9b42581-3793-4641-be04-9a4b17b059cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.453001] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.464654] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 61e911d7-b8e9-416e-b73c-574768744974 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.476047] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f5f7e84c-2d39-4929-be15-e7c03fae4319 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.486040] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d4e0170a-0993-4f7f-a7fa-6539bb13a082 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.496462] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 2681fbe1-7ed8-4280-95ac-f98063278b52 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.507984] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3be03d93-aae7-4312-832f-5a61b49753bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.519607] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 78471258-10a4-42e2-8d2a-f30b2baaa5d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.529113] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5d6be180-d89f-44ba-847e-0ea169316d90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.539310] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4474e61b-0664-40f7-a8ec-be3d14684b10 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.550079] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a12f05-6178-44bb-9eb0-b52d806fe91d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.559918] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 65d39cb0-8eed-49e2-a854-032d527cd0e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.569855] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bc074836-1520-44cf-aae0-acbfaa7a77e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.585121] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 693.585400] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 693.585532] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 693.969978] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac803c5-33e0-4b92-b386-501d1201f1bb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.973048] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 693.973048] nova-compute[62208]: warnings.warn( [ 693.980098] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dda6201c-ab28-4f4e-9eff-0356dc3362a4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.983443] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 693.983443] nova-compute[62208]: warnings.warn( [ 694.012901] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64433a98-26ac-4340-aa02-17a08d86ec16 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.015428] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
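The resource-tracker summary above ("Final resource view: ... used_ram=1792MB ... used_disk=10GB ... used_vcpus=10") is consistent with the ten actively managed 1 vCPU / 128 MB / 1 GB claims listed before it plus the 512 MB of reserved host memory from the inventory; a quick arithmetic check:

    # Consistency check against the "Final resource view" line above.
    instances = 10                     # actively managed instances with allocations
    per_instance = {'MEMORY_MB': 128, 'VCPU': 1, 'DISK_GB': 1}
    reserved_ram_mb = 512              # reserved memory from the inventory data

    used_ram_mb = reserved_ram_mb + instances * per_instance['MEMORY_MB']
    used_vcpus = instances * per_instance['VCPU']
    used_disk_gb = instances * per_instance['DISK_GB']
    print(used_ram_mb, used_vcpus, used_disk_gb)   # 1792 10 10 -- matches the log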
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 694.015428] nova-compute[62208]: warnings.warn( [ 694.021667] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56c008dd-eb21-405f-b2fe-8e88f0732df6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.025883] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 694.025883] nova-compute[62208]: warnings.warn( [ 694.038795] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 694.048766] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 694.067739] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 694.068451] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.797s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 695.025910] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 695.026277] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 695.026277] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 695.026419] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task 
ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 695.136574] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.851023] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 700.851023] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 700.851590] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore1 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 700.852889] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 700.853182] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Copying Virtual Disk [datastore1] vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore1] 
vmware_temp/d913c11c-e3c6-41a0-b3f6-ec9f53f7e4e3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 700.853434] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-813406c7-e1b9-4341-add8-79654320e74d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.855901] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 700.855901] nova-compute[62208]: warnings.warn( [ 700.862838] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Waiting for the task: (returnval){ [ 700.862838] nova-compute[62208]: value = "task-38440" [ 700.862838] nova-compute[62208]: _type = "Task" [ 700.862838] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 700.865990] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 700.865990] nova-compute[62208]: warnings.warn( [ 700.871721] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Task: {'id': task-38440, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 701.367464] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.367464] nova-compute[62208]: warnings.warn( [ 701.373655] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 701.374051] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Releasing lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 701.374683] nova-compute[62208]: Faults: ['InvalidArgument'] [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Traceback (most recent call last): [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] yield resources [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] self.driver.spawn(context, instance, image_meta, [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] self._fetch_image_if_missing(context, vi) [ 701.374683] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] image_cache(vi, tmp_image_ds_loc) [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] vm_util.copy_virtual_disk( [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] 
session._wait_for_task(vmdk_copy_task) [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] return self.wait_for_task(task_ref) [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] return evt.wait() [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] result = hub.switch() [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 701.375011] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] return self.greenlet.switch() [ 701.375318] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 701.375318] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] self.f(*self.args, **self.kw) [ 701.375318] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 701.375318] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] raise exceptions.translate_fault(task_info.error) [ 701.375318] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 701.375318] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Faults: ['InvalidArgument'] [ 701.375318] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] [ 701.375510] nova-compute[62208]: INFO nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Terminating instance [ 701.377601] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquired lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 701.377886] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 
tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 701.378611] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 701.378898] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 701.379180] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ed7e1ec2-86e2-4db8-a4ef-d4ac14a199a5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.381559] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e16fa316-4fa5-432a-9bc3-7c0ac2e04525 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.385229] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.385229] nova-compute[62208]: warnings.warn( [ 701.385642] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.385642] nova-compute[62208]: warnings.warn( [ 701.392334] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 701.392334] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-224b5155-e3b5-4dee-8524-2bb2d9293ed1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.393314] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 701.393490] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 701.394132] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-704e81d9-c54b-489c-b628-52a8c660bc6a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.396228] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.396228] nova-compute[62208]: warnings.warn( [ 701.396523] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.396523] nova-compute[62208]: warnings.warn( [ 701.399734] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Waiting for the task: (returnval){ [ 701.399734] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5205f5c3-b4ae-66bd-7184-9715af5c73ba" [ 701.399734] nova-compute[62208]: _type = "Task" [ 701.399734] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 701.402908] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.402908] nova-compute[62208]: warnings.warn( [ 701.407747] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5205f5c3-b4ae-66bd-7184-9715af5c73ba, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 701.474971] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 701.475311] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Deleting contents of the VM from datastore datastore1 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 701.475735] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Deleting the datastore file [datastore1] 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 701.476033] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8dcccd31-9d25-44d9-bfa2-6d05f566bae4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.478339] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.478339] nova-compute[62208]: warnings.warn( [ 701.483042] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Waiting for the task: (returnval){ [ 701.483042] nova-compute[62208]: value = "task-38442" [ 701.483042] nova-compute[62208]: _type = "Task" [ 701.483042] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 701.486826] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.486826] nova-compute[62208]: warnings.warn( [ 701.492782] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Task: {'id': task-38442, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 701.903794] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.903794] nova-compute[62208]: warnings.warn( [ 701.911314] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 701.911568] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Creating directory with path [datastore1] vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 701.911808] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0d96495d-fecc-4f77-aa79-b7f0b6849b6f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.913736] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.913736] nova-compute[62208]: warnings.warn( [ 701.925814] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Created directory with path [datastore1] vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 701.926024] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Fetch image to [datastore1] vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 701.926192] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore1] vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore1 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 701.926992] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43b3b23a-91bf-4d1c-91b8-da1820ce4e64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.929576] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.929576] nova-compute[62208]: warnings.warn( [ 701.934690] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8cb05d7-5bb5-4020-bfe8-2f63ee376c59 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.937136] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.937136] nova-compute[62208]: warnings.warn( [ 701.944570] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64a72a71-4bd4-48d3-a4e2-0f04cea3cb7f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.948388] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.948388] nova-compute[62208]: warnings.warn( [ 701.975681] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb6abfe8-9d6c-48f5-8ad1-5de958372e20 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.978204] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.978204] nova-compute[62208]: warnings.warn( [ 701.982567] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e1e80bb2-cba2-4a04-9776-d9375cf92729 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.987829] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.987829] nova-compute[62208]: warnings.warn( [ 701.988226] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 701.988226] nova-compute[62208]: warnings.warn( [ 701.993347] nova-compute[62208]: DEBUG oslo_vmware.api [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Task: {'id': task-38442, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066425} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 701.993618] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 701.993807] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Deleted contents of the VM from datastore datastore1 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 701.993982] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 701.994156] nova-compute[62208]: INFO nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Took 0.62 seconds to destroy the instance on the hypervisor. [ 701.996270] nova-compute[62208]: DEBUG nova.compute.claims [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Aborting claim: <nova.compute.claims.Claim object at 0x7fb93729f1f0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 701.996456] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 701.996689] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 702.007371] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore1 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 702.061591] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 
tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 702.120472] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 702.120684] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 702.484800] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c8cc71d-5fe6-4d77-834e-fc491d418382 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.487860] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 702.487860] nova-compute[62208]: warnings.warn( [ 702.494687] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dce52b5-1522-4f51-8c4a-54af4659b623 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.497625] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 702.497625] nova-compute[62208]: warnings.warn( [ 702.526342] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf0d5256-dbb5-4311-9136-221361c679d7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.528834] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 702.528834] nova-compute[62208]: warnings.warn( [ 702.534384] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-243ba4db-1650-4065-bfb1-32e4ab50b6aa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.538122] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 702.538122] nova-compute[62208]: warnings.warn( [ 702.547864] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 702.556342] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 702.574776] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.578s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 702.575319] nova-compute[62208]: Faults: ['InvalidArgument'] [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Traceback (most recent call last): [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] self.driver.spawn(context, instance, image_meta, [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 
29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] self._fetch_image_if_missing(context, vi) [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] image_cache(vi, tmp_image_ds_loc) [ 702.575319] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] vm_util.copy_virtual_disk( [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] session._wait_for_task(vmdk_copy_task) [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] return self.wait_for_task(task_ref) [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] return evt.wait() [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] result = hub.switch() [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] return self.greenlet.switch() [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 702.575820] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] self.f(*self.args, **self.kw) [ 702.576194] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 702.576194] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] raise exceptions.translate_fault(task_info.error) [ 702.576194] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 702.576194] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Faults: ['InvalidArgument'] [ 702.576194] nova-compute[62208]: ERROR nova.compute.manager [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] [ 702.576194] nova-compute[62208]: DEBUG nova.compute.utils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 702.577479] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Build of instance 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f was re-scheduled: A specified parameter was not correct: fileType [ 702.577479] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 702.577922] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 702.578094] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 702.578266] nova-compute[62208]: DEBUG nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 702.578428] nova-compute[62208]: DEBUG nova.network.neutron [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 702.919800] nova-compute[62208]: DEBUG nova.network.neutron [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.934843] nova-compute[62208]: INFO nova.compute.manager [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] [instance: 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f] Took 0.36 seconds to deallocate network for instance. [ 703.041991] nova-compute[62208]: INFO nova.scheduler.client.report [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Deleted allocations for instance 29ee4419-201b-4d3a-8e7b-c84a5eb36d1f [ 703.066478] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-efc7d3e9-8fa8-44f9-bebc-3b8d58e32b87 tempest-ServersWithSpecificFlavorTestJSON-1283351527 tempest-ServersWithSpecificFlavorTestJSON-1283351527-project-member] Lock "29ee4419-201b-4d3a-8e7b-c84a5eb36d1f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 102.481s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 703.091921] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 703.153151] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 703.153151] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 703.154455] nova-compute[62208]: INFO nova.compute.claims [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 703.617157] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d917780f-c869-4f53-9b59-14d4bcf1ef93 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.620080] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 703.620080] nova-compute[62208]: warnings.warn( [ 703.627106] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a78e67b3-fc04-4fa7-8fff-875c54893b53 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.630439] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 703.630439] nova-compute[62208]: warnings.warn( [ 703.660055] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de30dd8b-5ce3-4d27-b5ef-1a4b2c9a6075 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.662563] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 703.662563] nova-compute[62208]: warnings.warn( [ 703.668375] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5d4cfbd-3e4e-4381-a059-6461a12166df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.672153] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 703.672153] nova-compute[62208]: warnings.warn( [ 703.682962] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 703.692183] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 703.712901] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.560s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 703.713430] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 703.753113] nova-compute[62208]: DEBUG nova.compute.utils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 703.754820] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 703.754991] nova-compute[62208]: DEBUG nova.network.neutron [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 703.766097] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 703.801216] nova-compute[62208]: DEBUG nova.policy [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '18af387867884f67af91db1272225201', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '292576eaff6e47f2ac079ebdf420a2a8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 703.846821] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 703.868764] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 703.869033] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 703.869194] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 703.869376] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 703.869522] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 703.869671] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 703.869895] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 703.870076] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} 
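Annotation: the nova.virt.hardware records around this point walk the CPU-topology selection for the 1-vCPU m1.nano flavor. With no flavor or image constraints the per-dimension limits default to 65536, and the only exact factorization of one vCPU is sockets=1, cores=1, threads=1, which is why a single candidate topology is reported just below. The following is a minimal, self-contained sketch of that factorization step only, an illustration and not Nova's actual nova/virt/hardware.py code:

```python
# Illustration only: enumerate (sockets, cores, threads) triples whose
# product equals the requested vCPU count, within per-dimension limits.
# Nova's real logic also applies flavor/image preferences and sorting;
# this sketch shows just the "possible topologies" enumeration.
from dataclasses import dataclass


@dataclass(frozen=True)
class CPUTopology:
    sockets: int
    cores: int
    threads: int


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Return every exact factorization of `vcpus` within the limits."""
    found = []
    for sockets in range(1, min(max_sockets, vcpus) + 1):
        for cores in range(1, min(max_cores, vcpus) + 1):
            for threads in range(1, min(max_threads, vcpus) + 1):
                if sockets * cores * threads == vcpus:
                    found.append(CPUTopology(sockets, cores, threads))
    return found


# For the 1-vCPU flavor in this log the only candidate is 1:1:1,
# matching the "Got 1 possible topologies" record that follows.
print(possible_topologies(1))
```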
[ 703.870251] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 703.870417] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 703.870595] nova-compute[62208]: DEBUG nova.virt.hardware [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 703.871477] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9548a61-e51b-420e-9ec0-2929bce0dea0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.874990] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 703.874990] nova-compute[62208]: warnings.warn( [ 703.882499] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1de26b68-83be-4ef0-a78a-6a218066a623 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.888018] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 703.888018] nova-compute[62208]: warnings.warn( [ 704.394084] nova-compute[62208]: DEBUG nova.network.neutron [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Successfully created port: 5cd2e2fc-b43f-435f-a083-617d75c07338 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 705.109660] nova-compute[62208]: DEBUG nova.compute.manager [req-647a38d6-16cf-426a-8af4-1ee299c422f8 req-b08fa521-59ee-4ab2-8a2b-062c84f27d73 service nova] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Received event network-vif-plugged-5cd2e2fc-b43f-435f-a083-617d75c07338 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 705.109817] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-647a38d6-16cf-426a-8af4-1ee299c422f8 req-b08fa521-59ee-4ab2-8a2b-062c84f27d73 service nova] Acquiring lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 705.110019] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-647a38d6-16cf-426a-8af4-1ee299c422f8 req-b08fa521-59ee-4ab2-8a2b-062c84f27d73 service nova] Lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 705.110182] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-647a38d6-16cf-426a-8af4-1ee299c422f8 req-b08fa521-59ee-4ab2-8a2b-062c84f27d73 service nova] Lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 705.110341] nova-compute[62208]: DEBUG nova.compute.manager [req-647a38d6-16cf-426a-8af4-1ee299c422f8 req-b08fa521-59ee-4ab2-8a2b-062c84f27d73 service nova] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] No waiting events found dispatching network-vif-plugged-5cd2e2fc-b43f-435f-a083-617d75c07338 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 705.110497] nova-compute[62208]: WARNING nova.compute.manager [req-647a38d6-16cf-426a-8af4-1ee299c422f8 req-b08fa521-59ee-4ab2-8a2b-062c84f27d73 service nova] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Received unexpected event network-vif-plugged-5cd2e2fc-b43f-435f-a083-617d75c07338 for instance with vm_state building and task_state spawning. 
[ 705.214204] nova-compute[62208]: DEBUG nova.network.neutron [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Successfully updated port: 5cd2e2fc-b43f-435f-a083-617d75c07338 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 705.231793] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "refresh_cache-d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 705.231957] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquired lock "refresh_cache-d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 705.232133] nova-compute[62208]: DEBUG nova.network.neutron [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 705.289967] nova-compute[62208]: DEBUG nova.network.neutron [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 705.491098] nova-compute[62208]: DEBUG nova.network.neutron [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Updating instance_info_cache with network_info: [{"id": "5cd2e2fc-b43f-435f-a083-617d75c07338", "address": "fa:16:3e:c3:c1:ee", "network": {"id": "9dde5c55-ee65-4ad4-988d-a572b0a9ac6e", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1500945355-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "292576eaff6e47f2ac079ebdf420a2a8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5446413d-c3b0-4cd2-a962-62240db178ac", "external-id": "nsx-vlan-transportzone-528", "segmentation_id": 528, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5cd2e2fc-b4", "ovs_interfaceid": "5cd2e2fc-b43f-435f-a083-617d75c07338", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.504776] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Releasing lock "refresh_cache-d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 705.505080] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Instance network_info: |[{"id": "5cd2e2fc-b43f-435f-a083-617d75c07338", "address": "fa:16:3e:c3:c1:ee", "network": {"id": "9dde5c55-ee65-4ad4-988d-a572b0a9ac6e", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1500945355-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "292576eaff6e47f2ac079ebdf420a2a8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5446413d-c3b0-4cd2-a962-62240db178ac", "external-id": "nsx-vlan-transportzone-528", "segmentation_id": 528, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5cd2e2fc-b4", "ovs_interfaceid": "5cd2e2fc-b43f-435f-a083-617d75c07338", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 705.505568] nova-compute[62208]: 
DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c3:c1:ee', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5446413d-c3b0-4cd2-a962-62240db178ac', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5cd2e2fc-b43f-435f-a083-617d75c07338', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 705.513520] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Creating folder: Project (292576eaff6e47f2ac079ebdf420a2a8). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 705.513736] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4864dfa5-e021-423d-8edd-05a1fe7671d5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.517716] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 705.517716] nova-compute[62208]: warnings.warn( [ 705.526839] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Created folder: Project (292576eaff6e47f2ac079ebdf420a2a8) in parent group-v17427. [ 705.527058] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Creating folder: Instances. Parent ref: group-v17475. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 705.527300] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c21a8102-7c4f-4b95-aa7b-d202c2fb394d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.530878] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 705.530878] nova-compute[62208]: warnings.warn( [ 705.539272] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Created folder: Instances in parent group-v17475. [ 705.539539] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 705.539744] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 705.539956] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-22d42824-0324-477d-a6de-31c0294faa2e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.554205] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 705.554205] nova-compute[62208]: warnings.warn( [ 705.559581] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 705.559581] nova-compute[62208]: value = "task-38445" [ 705.559581] nova-compute[62208]: _type = "Task" [ 705.559581] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 705.563557] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 705.563557] nova-compute[62208]: warnings.warn( [ 705.567878] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38445, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 706.063531] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 706.063531] nova-compute[62208]: warnings.warn( [ 706.070551] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38445, 'name': CreateVM_Task, 'duration_secs': 0.325674} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 706.070726] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 706.071328] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 706.071556] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 706.074359] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7199ae29-781c-4ffd-a14e-df1993be9380 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.085557] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 706.085557] nova-compute[62208]: warnings.warn( [ 706.108682] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Reconfiguring VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 706.109061] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-185ce9bc-7e4b-47eb-9624-77fc799e93cd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.118894] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 706.118894] nova-compute[62208]: warnings.warn( [ 706.124709] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Waiting for the task: (returnval){ [ 706.124709] nova-compute[62208]: value = "task-38446" [ 706.124709] nova-compute[62208]: _type = "Task" [ 706.124709] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 706.127811] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 706.127811] nova-compute[62208]: warnings.warn( [ 706.135004] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Task: {'id': task-38446, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 706.628978] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 706.628978] nova-compute[62208]: warnings.warn( [ 706.634957] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Task: {'id': task-38446, 'name': ReconfigVM_Task, 'duration_secs': 0.11334} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 706.635226] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Reconfigured VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 706.635438] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.564s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 706.635716] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 706.635856] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 706.636186] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 
tempest-TaggedAttachmentsTest-1196056706-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 706.636447] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9752ca7e-4345-4e11-a630-b432b1b3a76f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.638154] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 706.638154] nova-compute[62208]: warnings.warn( [ 706.641787] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Waiting for the task: (returnval){ [ 706.641787] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52416b63-6956-32f0-af16-49f1e4367d48" [ 706.641787] nova-compute[62208]: _type = "Task" [ 706.641787] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 706.644771] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 706.644771] nova-compute[62208]: warnings.warn( [ 706.650367] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52416b63-6956-32f0-af16-49f1e4367d48, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 707.146791] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 707.146791] nova-compute[62208]: warnings.warn( [ 707.153594] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 707.153985] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 707.154296] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 707.159806] nova-compute[62208]: DEBUG nova.compute.manager [req-94a9a1c8-1c3e-4e5b-8b77-aeb238b7ada1 req-0edc5017-0175-4145-9fe2-2c341c195474 service nova] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Received event network-changed-5cd2e2fc-b43f-435f-a083-617d75c07338 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 707.160192] nova-compute[62208]: DEBUG nova.compute.manager [req-94a9a1c8-1c3e-4e5b-8b77-aeb238b7ada1 req-0edc5017-0175-4145-9fe2-2c341c195474 service nova] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Refreshing instance network info cache due to event network-changed-5cd2e2fc-b43f-435f-a083-617d75c07338. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 707.160511] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-94a9a1c8-1c3e-4e5b-8b77-aeb238b7ada1 req-0edc5017-0175-4145-9fe2-2c341c195474 service nova] Acquiring lock "refresh_cache-d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 707.160762] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-94a9a1c8-1c3e-4e5b-8b77-aeb238b7ada1 req-0edc5017-0175-4145-9fe2-2c341c195474 service nova] Acquired lock "refresh_cache-d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 707.161114] nova-compute[62208]: DEBUG nova.network.neutron [req-94a9a1c8-1c3e-4e5b-8b77-aeb238b7ada1 req-0edc5017-0175-4145-9fe2-2c341c195474 service nova] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Refreshing network info cache for port 5cd2e2fc-b43f-435f-a083-617d75c07338 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 707.629726] nova-compute[62208]: DEBUG nova.network.neutron [req-94a9a1c8-1c3e-4e5b-8b77-aeb238b7ada1 req-0edc5017-0175-4145-9fe2-2c341c195474 service nova] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Updated VIF entry in instance network info cache for port 5cd2e2fc-b43f-435f-a083-617d75c07338. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 707.630116] nova-compute[62208]: DEBUG nova.network.neutron [req-94a9a1c8-1c3e-4e5b-8b77-aeb238b7ada1 req-0edc5017-0175-4145-9fe2-2c341c195474 service nova] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Updating instance_info_cache with network_info: [{"id": "5cd2e2fc-b43f-435f-a083-617d75c07338", "address": "fa:16:3e:c3:c1:ee", "network": {"id": "9dde5c55-ee65-4ad4-988d-a572b0a9ac6e", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1500945355-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "292576eaff6e47f2ac079ebdf420a2a8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5446413d-c3b0-4cd2-a962-62240db178ac", "external-id": "nsx-vlan-transportzone-528", "segmentation_id": 528, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5cd2e2fc-b4", "ovs_interfaceid": "5cd2e2fc-b43f-435f-a083-617d75c07338", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.639613] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-94a9a1c8-1c3e-4e5b-8b77-aeb238b7ada1 req-0edc5017-0175-4145-9fe2-2c341c195474 service nova] Releasing lock "refresh_cache-d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 710.339455] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 710.339455] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 710.340099] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 710.341331] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 710.341594] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Copying Virtual Disk [datastore2] vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/0f82ca92-2d13-448e-bd33-626b6e68b66c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 710.341898] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8ba1642d-4ac9-4fb3-ad25-57bb309f79c7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.344543] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 710.344543] nova-compute[62208]: warnings.warn( [ 710.351042] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Waiting for the task: (returnval){ [ 710.351042] nova-compute[62208]: value = "task-38447" [ 710.351042] nova-compute[62208]: _type = "Task" [ 710.351042] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 710.355354] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 710.355354] nova-compute[62208]: warnings.warn( [ 710.360901] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Task: {'id': task-38447, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 710.854725] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 710.854725] nova-compute[62208]: warnings.warn( [ 710.861038] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 710.861363] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 710.861964] nova-compute[62208]: Faults: ['InvalidArgument'] [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Traceback (most recent call last): [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] yield resources [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] self.driver.spawn(context, instance, image_meta, [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] self._vmops.spawn(context, instance, image_meta, injected_files, [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] self._fetch_image_if_missing(context, vi) [ 710.861964] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: 
f9954bd1-8df3-445c-bb4c-ee316b7b0447] image_cache(vi, tmp_image_ds_loc) [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] vm_util.copy_virtual_disk( [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] session._wait_for_task(vmdk_copy_task) [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] return self.wait_for_task(task_ref) [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] return evt.wait() [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] result = hub.switch() [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 710.862284] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] return self.greenlet.switch() [ 710.862599] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 710.862599] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] self.f(*self.args, **self.kw) [ 710.862599] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 710.862599] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] raise exceptions.translate_fault(task_info.error) [ 710.862599] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 710.862599] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Faults: ['InvalidArgument'] [ 710.862599] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] [ 710.862599] nova-compute[62208]: INFO nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: 
f9954bd1-8df3-445c-bb4c-ee316b7b0447] Terminating instance [ 710.865111] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquiring lock "refresh_cache-f9954bd1-8df3-445c-bb4c-ee316b7b0447" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 710.865111] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquired lock "refresh_cache-f9954bd1-8df3-445c-bb4c-ee316b7b0447" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 710.865111] nova-compute[62208]: DEBUG nova.network.neutron [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 710.866491] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 710.866693] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 710.866946] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8aa46548-86eb-46d3-afa2-1204ff5f4120 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.869911] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 710.869911] nova-compute[62208]: warnings.warn( [ 710.877113] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 710.877390] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 710.878709] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-13dc083f-fba6-4d59-af61-6ade80a230ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.881654] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 710.881654] nova-compute[62208]: warnings.warn( [ 710.885448] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 710.885448] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52ea101a-af7d-b0ec-cbd8-3e2e7cf993b9" [ 710.885448] nova-compute[62208]: _type = "Task" [ 710.885448] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 710.889100] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 710.889100] nova-compute[62208]: warnings.warn( [ 710.894143] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52ea101a-af7d-b0ec-cbd8-3e2e7cf993b9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 710.901617] nova-compute[62208]: DEBUG nova.network.neutron [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 710.942732] nova-compute[62208]: DEBUG nova.network.neutron [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.953214] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Releasing lock "refresh_cache-f9954bd1-8df3-445c-bb4c-ee316b7b0447" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 710.953214] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 710.953214] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 710.954207] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44c33d94-db80-47ac-9e8e-0d47617fe49b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.957693] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 710.957693] nova-compute[62208]: warnings.warn( [ 710.963266] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 710.964035] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-021bbb63-be32-4dcf-95c3-2bcfd8eb226b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.965013] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 710.965013] nova-compute[62208]: warnings.warn( [ 711.003391] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 711.003616] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 711.003794] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Deleting the datastore file [datastore2] f9954bd1-8df3-445c-bb4c-ee316b7b0447 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 711.004104] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c4a79a7e-a754-415c-afea-540d74608db0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.005946] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.005946] nova-compute[62208]: warnings.warn( [ 711.012141] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Waiting for the task: (returnval){ [ 711.012141] nova-compute[62208]: value = "task-38449" [ 711.012141] nova-compute[62208]: _type = "Task" [ 711.012141] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 711.015958] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.015958] nova-compute[62208]: warnings.warn( [ 711.022671] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Task: {'id': task-38449, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 711.390583] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.390583] nova-compute[62208]: warnings.warn( [ 711.396528] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 711.396787] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Creating directory with path [datastore2] vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 711.397021] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b5ad2743-11e4-4b7a-8b01-578fada1bb67 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.398970] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.398970] nova-compute[62208]: warnings.warn( [ 711.409875] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Created directory with path [datastore2] vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 711.410095] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Fetch image to [datastore2] vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 711.410265] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 711.411055] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f891597-2f25-4e44-b0c8-fa86b979c9c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.414099] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.414099] nova-compute[62208]: warnings.warn( [ 711.419147] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33791afb-7695-4c94-b7a2-1bb31365549a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.421946] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.421946] nova-compute[62208]: warnings.warn( [ 711.429597] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3df7fa0-ce28-4084-bee4-57646ad18b71 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.434375] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.434375] nova-compute[62208]: warnings.warn( [ 711.463942] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d293bf75-78a9-4dec-86e5-c68832e21690 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.466818] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.466818] nova-compute[62208]: warnings.warn( [ 711.471371] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-52b08791-25b9-4584-80d6-228b8ff59165 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.473415] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.473415] nova-compute[62208]: warnings.warn( [ 711.504378] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 711.517343] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 711.517343] nova-compute[62208]: warnings.warn( [ 711.524558] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Task: {'id': task-38449, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035891} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 711.529081] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 711.529344] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 711.529574] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 711.529834] nova-compute[62208]: INFO nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Took 0.58 seconds to destroy the instance on the hypervisor. [ 711.530071] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 711.530517] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 711.532938] nova-compute[62208]: DEBUG nova.compute.claims [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9376af190> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 711.533150] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 711.533377] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 711.562021] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 711.619716] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 711.619855] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 712.021455] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ce85a8-e1e0-4a67-8e27-bce1b06f8e93 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.024203] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 712.024203] nova-compute[62208]: warnings.warn( [ 712.030167] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af827f5c-ecdb-472c-b0c1-324648a51ab5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.033150] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 712.033150] nova-compute[62208]: warnings.warn( [ 712.065785] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e564dded-36ce-4152-a482-2ce01e45f15e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.068281] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 712.068281] nova-compute[62208]: warnings.warn( [ 712.074180] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c76468c7-1c45-48c5-87d5-0855a03a0f88 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.078552] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 712.078552] nova-compute[62208]: warnings.warn( [ 712.090876] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.099882] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.119103] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.586s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 712.119632] nova-compute[62208]: Faults: ['InvalidArgument'] [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Traceback (most recent call last): [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] self.driver.spawn(context, instance, image_meta, [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] self._vmops.spawn(context, instance, image_meta, injected_files, [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] self._fetch_image_if_missing(context, vi) [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] image_cache(vi, tmp_image_ds_loc) [ 712.119632] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] vm_util.copy_virtual_disk( [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] session._wait_for_task(vmdk_copy_task) [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] return self.wait_for_task(task_ref) [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] return evt.wait() [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] result = hub.switch() [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] return self.greenlet.switch() [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 712.119957] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] self.f(*self.args, **self.kw) [ 712.120317] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 712.120317] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] raise exceptions.translate_fault(task_info.error) [ 712.120317] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 712.120317] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Faults: ['InvalidArgument'] [ 712.120317] nova-compute[62208]: ERROR nova.compute.manager [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] [ 712.120453] nova-compute[62208]: DEBUG nova.compute.utils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa 
tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 712.121938] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Build of instance f9954bd1-8df3-445c-bb4c-ee316b7b0447 was re-scheduled: A specified parameter was not correct: fileType [ 712.121938] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 712.122307] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 712.122526] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquiring lock "refresh_cache-f9954bd1-8df3-445c-bb4c-ee316b7b0447" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 712.122667] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Acquired lock "refresh_cache-f9954bd1-8df3-445c-bb4c-ee316b7b0447" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 712.122821] nova-compute[62208]: DEBUG nova.network.neutron [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 712.150373] nova-compute[62208]: DEBUG nova.network.neutron [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 712.188085] nova-compute[62208]: DEBUG nova.network.neutron [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.197876] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Releasing lock "refresh_cache-f9954bd1-8df3-445c-bb4c-ee316b7b0447" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 712.198085] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 712.198259] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] Skipping network deallocation for instance since networking was not requested. {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 712.302484] nova-compute[62208]: INFO nova.scheduler.client.report [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Deleted allocations for instance f9954bd1-8df3-445c-bb4c-ee316b7b0447 [ 712.322963] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b1b06ac-08ff-4509-8b67-2275e782f4aa tempest-AutoAllocateNetworkTest-632369415 tempest-AutoAllocateNetworkTest-632369415-project-member] Lock "f9954bd1-8df3-445c-bb4c-ee316b7b0447" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 139.469s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 712.324166] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "f9954bd1-8df3-445c-bb4c-ee316b7b0447" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 137.925s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.324296] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f9954bd1-8df3-445c-bb4c-ee316b7b0447] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 712.324464] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "f9954bd1-8df3-445c-bb4c-ee316b7b0447" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 712.340589] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 712.396178] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 712.396471] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.398041] nova-compute[62208]: INFO nova.compute.claims [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 712.831549] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0af129c7-e68b-4328-957a-e81dfdcbfddb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.834198] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 712.834198] nova-compute[62208]: warnings.warn( [ 712.839615] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e531f643-b81e-4d78-a5cb-f6f728baf5a4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.842494] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 712.842494] nova-compute[62208]: warnings.warn( [ 712.869731] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5632fc1-d7d7-4fdc-ac95-81e3a64ddb08 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.872297] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 712.872297] nova-compute[62208]: warnings.warn( [ 712.877883] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df82ac24-4732-43f3-a55c-c29745e9b487 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.881719] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 712.881719] nova-compute[62208]: warnings.warn( [ 712.891588] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.902256] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.932047] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.524s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 712.932047] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 712.960997] nova-compute[62208]: DEBUG nova.compute.utils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 712.962828] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 712.963087] nova-compute[62208]: DEBUG nova.network.neutron [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 712.974223] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 713.008405] nova-compute[62208]: DEBUG nova.policy [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be1a5332a52c4379b5ad158283a9b50b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '423b83f1a2df47c593cd4fc3a446409b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 713.053038] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 713.073816] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 713.074058] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 713.074216] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 713.074394] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 713.074542] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 713.074693] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 713.074901] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 713.075064] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 713.075234] 
nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 713.075389] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 713.075564] nova-compute[62208]: DEBUG nova.virt.hardware [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 713.076451] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-791c46e6-9474-4595-8d13-78c1dd598cfb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.079106] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 713.079106] nova-compute[62208]: warnings.warn( [ 713.085116] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d00fd2c0-6296-4658-8616-beaddfb5de42 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.089112] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 713.089112] nova-compute[62208]: warnings.warn( [ 713.515258] nova-compute[62208]: DEBUG nova.network.neutron [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Successfully created port: 94f095dd-5bb1-4266-a64a-c06dde239855 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 714.753451] nova-compute[62208]: DEBUG nova.compute.manager [req-dfbe3a55-522d-4dc4-8931-3932b964fb0a req-d7200085-b61f-4ad3-992a-4183e4b11424 service nova] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Received event network-vif-plugged-94f095dd-5bb1-4266-a64a-c06dde239855 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 714.753722] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dfbe3a55-522d-4dc4-8931-3932b964fb0a req-d7200085-b61f-4ad3-992a-4183e4b11424 service nova] Acquiring lock "08336643-4254-4447-b7c2-b81054bf9707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.753874] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dfbe3a55-522d-4dc4-8931-3932b964fb0a req-d7200085-b61f-4ad3-992a-4183e4b11424 service nova] Lock "08336643-4254-4447-b7c2-b81054bf9707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 714.754036] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dfbe3a55-522d-4dc4-8931-3932b964fb0a req-d7200085-b61f-4ad3-992a-4183e4b11424 service nova] Lock "08336643-4254-4447-b7c2-b81054bf9707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 714.754200] nova-compute[62208]: DEBUG nova.compute.manager [req-dfbe3a55-522d-4dc4-8931-3932b964fb0a req-d7200085-b61f-4ad3-992a-4183e4b11424 service nova] [instance: 08336643-4254-4447-b7c2-b81054bf9707] No waiting events found dispatching network-vif-plugged-94f095dd-5bb1-4266-a64a-c06dde239855 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 714.754364] nova-compute[62208]: WARNING nova.compute.manager [req-dfbe3a55-522d-4dc4-8931-3932b964fb0a req-d7200085-b61f-4ad3-992a-4183e4b11424 service nova] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Received unexpected event network-vif-plugged-94f095dd-5bb1-4266-a64a-c06dde239855 for instance with vm_state building and task_state spawning. 
[ 715.064999] nova-compute[62208]: DEBUG nova.network.neutron [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Successfully updated port: 94f095dd-5bb1-4266-a64a-c06dde239855 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 715.080157] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "refresh_cache-08336643-4254-4447-b7c2-b81054bf9707" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 715.080310] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquired lock "refresh_cache-08336643-4254-4447-b7c2-b81054bf9707" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 715.080467] nova-compute[62208]: DEBUG nova.network.neutron [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 715.155118] nova-compute[62208]: DEBUG nova.network.neutron [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 715.237279] nova-compute[62208]: DEBUG nova.compute.manager [req-1d3f3b29-40da-4a5d-acef-9855fabf8a55 req-a18c8f93-1009-4269-b273-b7682d1ecd87 service nova] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Received event network-changed-94f095dd-5bb1-4266-a64a-c06dde239855 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 715.237480] nova-compute[62208]: DEBUG nova.compute.manager [req-1d3f3b29-40da-4a5d-acef-9855fabf8a55 req-a18c8f93-1009-4269-b273-b7682d1ecd87 service nova] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Refreshing instance network info cache due to event network-changed-94f095dd-5bb1-4266-a64a-c06dde239855. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 715.237740] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1d3f3b29-40da-4a5d-acef-9855fabf8a55 req-a18c8f93-1009-4269-b273-b7682d1ecd87 service nova] Acquiring lock "refresh_cache-08336643-4254-4447-b7c2-b81054bf9707" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 715.502081] nova-compute[62208]: DEBUG nova.network.neutron [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Updating instance_info_cache with network_info: [{"id": "94f095dd-5bb1-4266-a64a-c06dde239855", "address": "fa:16:3e:cb:33:25", "network": {"id": "682ed35c-e860-4573-8aaa-585eec31f459", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-133906471-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "423b83f1a2df47c593cd4fc3a446409b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "023d6500-887e-4dc4-bec5-06b40450d9c0", "external-id": "nsx-vlan-transportzone-108", "segmentation_id": 108, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94f095dd-5b", "ovs_interfaceid": "94f095dd-5bb1-4266-a64a-c06dde239855", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.515907] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Releasing lock "refresh_cache-08336643-4254-4447-b7c2-b81054bf9707" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 715.516251] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Instance network_info: |[{"id": "94f095dd-5bb1-4266-a64a-c06dde239855", "address": "fa:16:3e:cb:33:25", "network": {"id": "682ed35c-e860-4573-8aaa-585eec31f459", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-133906471-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "423b83f1a2df47c593cd4fc3a446409b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "023d6500-887e-4dc4-bec5-06b40450d9c0", "external-id": "nsx-vlan-transportzone-108", "segmentation_id": 108, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94f095dd-5b", 
"ovs_interfaceid": "94f095dd-5bb1-4266-a64a-c06dde239855", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 715.516575] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1d3f3b29-40da-4a5d-acef-9855fabf8a55 req-a18c8f93-1009-4269-b273-b7682d1ecd87 service nova] Acquired lock "refresh_cache-08336643-4254-4447-b7c2-b81054bf9707" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 715.516762] nova-compute[62208]: DEBUG nova.network.neutron [req-1d3f3b29-40da-4a5d-acef-9855fabf8a55 req-a18c8f93-1009-4269-b273-b7682d1ecd87 service nova] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Refreshing network info cache for port 94f095dd-5bb1-4266-a64a-c06dde239855 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 715.518129] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cb:33:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '023d6500-887e-4dc4-bec5-06b40450d9c0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '94f095dd-5bb1-4266-a64a-c06dde239855', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 715.527434] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Creating folder: Project (423b83f1a2df47c593cd4fc3a446409b). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 715.531077] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-068620d4-4941-41cc-b587-8ab193e48594 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.537321] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 715.537321] nova-compute[62208]: warnings.warn( [ 715.548380] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Created folder: Project (423b83f1a2df47c593cd4fc3a446409b) in parent group-v17427. [ 715.548380] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Creating folder: Instances. Parent ref: group-v17478. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 715.548595] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a8f0901d-66ee-4084-b69f-98864b48f491 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.552602] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 715.552602] nova-compute[62208]: warnings.warn( [ 715.560272] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Created folder: Instances in parent group-v17478. [ 715.560741] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 715.560802] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 715.560982] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0051ba8a-41cd-43c7-a7d2-2f10fae8b12d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.576803] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 715.576803] nova-compute[62208]: warnings.warn( [ 715.591473] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 715.591473] nova-compute[62208]: value = "task-38452" [ 715.591473] nova-compute[62208]: _type = "Task" [ 715.591473] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 715.594949] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 715.594949] nova-compute[62208]: warnings.warn( [ 715.600584] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38452, 'name': CreateVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 716.098795] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 716.098795] nova-compute[62208]: warnings.warn( [ 716.104890] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38452, 'name': CreateVM_Task, 'duration_secs': 0.497341} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 716.105122] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 716.105926] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 716.106237] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 716.114884] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b45a58a-3d21-4557-af7f-0546ee1cc245 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.133448] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 716.133448] nova-compute[62208]: warnings.warn( [ 716.155684] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Reconfiguring VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 716.156931] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-a451c1e1-1d26-4061-ad91-b63e711e7c0c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.167034] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 716.167034] nova-compute[62208]: warnings.warn( [ 716.179810] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Waiting for the task: (returnval){ [ 716.179810] nova-compute[62208]: value = "task-38453" [ 716.179810] nova-compute[62208]: _type = "Task" [ 716.179810] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 716.182683] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 716.182683] nova-compute[62208]: warnings.warn( [ 716.188466] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Task: {'id': task-38453, 'name': ReconfigVM_Task} progress is 14%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 716.412102] nova-compute[62208]: DEBUG nova.network.neutron [req-1d3f3b29-40da-4a5d-acef-9855fabf8a55 req-a18c8f93-1009-4269-b273-b7682d1ecd87 service nova] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Updated VIF entry in instance network info cache for port 94f095dd-5bb1-4266-a64a-c06dde239855. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 716.412603] nova-compute[62208]: DEBUG nova.network.neutron [req-1d3f3b29-40da-4a5d-acef-9855fabf8a55 req-a18c8f93-1009-4269-b273-b7682d1ecd87 service nova] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Updating instance_info_cache with network_info: [{"id": "94f095dd-5bb1-4266-a64a-c06dde239855", "address": "fa:16:3e:cb:33:25", "network": {"id": "682ed35c-e860-4573-8aaa-585eec31f459", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-133906471-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "423b83f1a2df47c593cd4fc3a446409b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "023d6500-887e-4dc4-bec5-06b40450d9c0", "external-id": "nsx-vlan-transportzone-108", "segmentation_id": 108, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94f095dd-5b", "ovs_interfaceid": "94f095dd-5bb1-4266-a64a-c06dde239855", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.422870] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1d3f3b29-40da-4a5d-acef-9855fabf8a55 req-a18c8f93-1009-4269-b273-b7682d1ecd87 service nova] Releasing lock "refresh_cache-08336643-4254-4447-b7c2-b81054bf9707" {{(pid=62208) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 716.684114] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 716.684114] nova-compute[62208]: warnings.warn( [ 716.689881] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Task: {'id': task-38453, 'name': ReconfigVM_Task, 'duration_secs': 0.11354} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 716.690156] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Reconfigured VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 716.690359] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.584s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 716.690601] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 716.690775] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 716.691094] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 716.691347] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-251f5c43-f133-4971-9ea2-72234d720d85 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.692933] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 716.692933] nova-compute[62208]: warnings.warn( [ 716.696336] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Waiting for the task: (returnval){ [ 716.696336] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f11f04-b479-27d2-8fce-abca165fe93d" [ 716.696336] nova-compute[62208]: _type = "Task" [ 716.696336] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 716.700554] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 716.700554] nova-compute[62208]: warnings.warn( [ 716.705427] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f11f04-b479-27d2-8fce-abca165fe93d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 717.201015] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 717.201015] nova-compute[62208]: warnings.warn( [ 717.208616] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 717.208616] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 717.208616] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 718.785581] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 718.785921] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 719.374447] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f219345c-7bf9-4222-833a-687f7f181060 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "e21efb06-821b-4bec-a6d9-f57ae59d038a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 719.374664] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f219345c-7bf9-4222-833a-687f7f181060 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "e21efb06-821b-4bec-a6d9-f57ae59d038a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 750.260056] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 
tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 750.260056] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 750.260649] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore1 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 750.261705] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 750.261949] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Copying Virtual Disk [datastore1] vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore1] vmware_temp/e7cc4b90-7f4d-4b42-a51e-cce5501f6b3e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 750.262230] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-61fab71f-6a8f-4c61-8fcb-f89377790ba9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 750.264798] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 750.264798] nova-compute[62208]: warnings.warn( [ 750.270992] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Waiting for the task: (returnval){ [ 750.270992] nova-compute[62208]: value = "task-38454" [ 750.270992] nova-compute[62208]: _type = "Task" [ 750.270992] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 750.274303] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 750.274303] nova-compute[62208]: warnings.warn( [ 750.279769] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Task: {'id': task-38454, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 750.775744] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 750.775744] nova-compute[62208]: warnings.warn( [ 750.782200] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 750.782500] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Releasing lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 750.783061] nova-compute[62208]: Faults: ['InvalidArgument'] [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Traceback (most recent call last): [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] yield resources [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] self.driver.spawn(context, instance, image_meta, [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] self._fetch_image_if_missing(context, vi) [ 750.783061] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] image_cache(vi, tmp_image_ds_loc) [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] vm_util.copy_virtual_disk( [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] 
session._wait_for_task(vmdk_copy_task) [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] return self.wait_for_task(task_ref) [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] return evt.wait() [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] result = hub.switch() [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 750.783471] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] return self.greenlet.switch() [ 750.783944] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 750.783944] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] self.f(*self.args, **self.kw) [ 750.783944] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 750.783944] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] raise exceptions.translate_fault(task_info.error) [ 750.783944] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 750.783944] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Faults: ['InvalidArgument'] [ 750.783944] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] [ 750.783944] nova-compute[62208]: INFO nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Terminating instance [ 750.785763] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 750.785956] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 750.786715] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31ef1e6a-f276-4d74-a927-106eb2cde452 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 750.789514] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 750.789514] nova-compute[62208]: warnings.warn( [ 750.795205] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 750.795205] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d3bd13d1-c423-402c-a4aa-201b1f4d1bd7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 750.796501] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 750.796501] nova-compute[62208]: warnings.warn( [ 751.086981] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 751.087214] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Deleting contents of the VM from datastore datastore1 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 751.087400] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Deleting the datastore file [datastore1] e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 751.087701] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-711dcaac-886d-4216-92ca-eea26a7bba90 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.089632] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 751.089632] nova-compute[62208]: warnings.warn( [ 751.094470] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Waiting for the task: (returnval){ [ 751.094470] nova-compute[62208]: value = "task-38456" [ 751.094470] nova-compute[62208]: _type = "Task" [ 751.094470] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 751.097647] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 751.097647] nova-compute[62208]: warnings.warn( [ 751.102851] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Task: {'id': task-38456, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 751.598988] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 751.598988] nova-compute[62208]: warnings.warn( [ 751.604719] nova-compute[62208]: DEBUG oslo_vmware.api [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Task: {'id': task-38456, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073215} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 751.604977] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 751.605163] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Deleted contents of the VM from datastore datastore1 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 751.605394] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 751.605583] nova-compute[62208]: INFO nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Took 0.82 seconds to destroy the instance on the hypervisor. 
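[editor's annotation] The traceback above shows how the CopyVirtualDisk_Task failure ("A specified parameter was not correct: fileType", fault 'InvalidArgument') reaches Nova: oslo_vmware's _poll_task sees the task in an error state and raises exceptions.translate_fault(task_info.error), which spawn() treats as a fatal error and the instance is destroyed. The following is a minimal, illustrative sketch of that polling/fault-translation pattern only — the names `get_task_info` and `VimFault` are placeholders, not the real oslo.vmware API surface.

    # Illustrative sketch of the task-polling pattern visible in the traceback
    # above (oslo_vmware.api wait_for_task/_poll_task). Placeholder names, not
    # the actual oslo.vmware implementation.
    import time


    class VimFault(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

        def __init__(self, message, fault_list):
            super().__init__(message)
            self.fault_list = fault_list


    def wait_for_task(get_task_info, task_ref, interval=0.5):
        """Poll a vCenter task until success, or raise the translated fault.

        `get_task_info(task_ref)` is assumed to return an object with
        `state` ('running' | 'success' | 'error') and, on error, an
        `error` carrying a localized message and a fault list, mirroring
        the TaskInfo fields reported in the log entries above.
        """
        while True:
            info = get_task_info(task_ref)
            if info.state == "success":
                return info
            if info.state == "error":
                # This is the step where "A specified parameter was not
                # correct: fileType / Faults: ['InvalidArgument']" surfaced
                # for task-38454 in the log above.
                raise VimFault(info.error.localizedMessage, info.error.faults)
            time.sleep(interval)  # the log shows ~0.5s polling intervals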
[ 751.608110] nova-compute[62208]: DEBUG nova.compute.claims [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936e2bf10> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 751.608293] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.608609] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 752.038677] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-275e3d23-ebc2-4bea-a8b1-68c01510859e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.041205] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 752.041205] nova-compute[62208]: warnings.warn( [ 752.046998] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74720b69-ba70-4a7d-91ff-7388779a13c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.051419] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 752.051419] nova-compute[62208]: warnings.warn( [ 752.079257] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0661b30-a353-4fdd-a9ff-1c8c9f26d45e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.081822] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 752.081822] nova-compute[62208]: warnings.warn( [ 752.087387] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a997a8d-6ff9-47f3-bcb4-0ddf6eba73fd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.091125] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 752.091125] nova-compute[62208]: warnings.warn( [ 752.100894] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 752.109989] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 752.128135] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.519s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 752.128717] nova-compute[62208]: Faults: ['InvalidArgument'] [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Traceback (most recent call last): [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] self.driver.spawn(context, instance, image_meta, [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] self._fetch_image_if_missing(context, vi) [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] image_cache(vi, tmp_image_ds_loc) [ 752.128717] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] vm_util.copy_virtual_disk( [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] session._wait_for_task(vmdk_copy_task) [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] return self.wait_for_task(task_ref) [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] return evt.wait() [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] result = hub.switch() [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] return self.greenlet.switch() [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 752.129072] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] self.f(*self.args, **self.kw) [ 752.129404] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in 
_poll_task [ 752.129404] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] raise exceptions.translate_fault(task_info.error) [ 752.129404] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 752.129404] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Faults: ['InvalidArgument'] [ 752.129404] nova-compute[62208]: ERROR nova.compute.manager [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] [ 752.129534] nova-compute[62208]: DEBUG nova.compute.utils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 752.130848] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Build of instance e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c was re-scheduled: A specified parameter was not correct: fileType [ 752.130848] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 752.131216] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 752.131388] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 752.131558] nova-compute[62208]: DEBUG nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 752.131724] nova-compute[62208]: DEBUG nova.network.neutron [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 752.673483] nova-compute[62208]: DEBUG nova.network.neutron [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.688988] nova-compute[62208]: INFO nova.compute.manager [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] [instance: e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c] Took 0.56 seconds to deallocate network for instance. [ 752.806962] nova-compute[62208]: INFO nova.scheduler.client.report [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Deleted allocations for instance e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c [ 752.827671] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-935abcbe-87ec-40d6-a6eb-55bfc1cb6f3c tempest-VolumesAssistedSnapshotsTest-322351418 tempest-VolumesAssistedSnapshotsTest-322351418-project-member] Lock "e4e1b466-759a-4e2a-b784-6a3b0a8b7e2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 150.466s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 752.841862] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 752.904417] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 752.904706] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 752.906214] nova-compute[62208]: INFO nova.compute.claims [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 753.141002] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 753.141164] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 753.141285] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 753.164513] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.164741] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.164908] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.165027] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.165113] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.165249] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.165348] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.165469] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.165736] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.165934] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 753.166111] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 753.166595] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 753.166742] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 753.330268] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d820c9f-dfc5-42eb-a541-7af3db9c8666 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.333074] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 753.333074] nova-compute[62208]: warnings.warn( [ 753.338160] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-359afb42-9f0d-4743-8ac3-098bb3cc5f0d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.340967] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 753.340967] nova-compute[62208]: warnings.warn( [ 753.367313] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f559f1d6-077c-4294-89a6-bffc0e88250d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.369746] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 753.369746] nova-compute[62208]: warnings.warn( [ 753.374968] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9de0e5be-2aeb-49eb-bf42-1ce34b2468ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.378529] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 753.378529] nova-compute[62208]: warnings.warn( [ 753.388031] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 753.396161] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 753.414004] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.509s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 753.414487] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 753.450821] nova-compute[62208]: DEBUG nova.compute.utils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 753.452336] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 753.452512] nova-compute[62208]: DEBUG nova.network.neutron [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 753.463189] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Start building block device mappings for instance. 
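[editor's annotation] The inventory data repeated in the scheduler report entries above determines the effective schedulable capacity once reservations and allocation ratios are applied. A small worked example under that data (the formula (total - reserved) * allocation_ratio is the usual Placement convention; this is an illustration, not Nova source):

    # Worked example using the inventory reported for provider
    # 8d308854-9c5b-48ef-bafe-5c6c728e46d8 in the log entries above.
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity:g} schedulable units")
    # VCPU: 192 schedulable units
    # MEMORY_MB: 196078 schedulable units
    # DISK_GB: 400 schedulable units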
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 753.502617] nova-compute[62208]: DEBUG nova.policy [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9f470e245d64454926ab9c07ee45f92', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c7660c6c500410eb713a843713ee74f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 753.542693] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 753.565315] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 753.565608] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 753.565742] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 753.565927] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 753.566086] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 753.566235] 
nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 753.566443] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 753.566604] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 753.566849] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 753.567163] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 753.567350] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 753.568281] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98c9c42f-5008-4e0a-a42e-a829b971a03d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.570876] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 753.570876] nova-compute[62208]: warnings.warn( [ 753.576598] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6d3e6ba-402c-44e2-a845-cd68ea32672b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.581076] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 753.581076] nova-compute[62208]: warnings.warn( [ 753.862584] nova-compute[62208]: DEBUG nova.network.neutron [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Successfully created port: a19e8c2e-80b0-4986-8485-f7e977fc2a79 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 754.141413] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.141635] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.141783] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.141942] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.518662] nova-compute[62208]: DEBUG nova.network.neutron [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Successfully updated port: a19e8c2e-80b0-4986-8485-f7e977fc2a79 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 754.532160] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "refresh_cache-af8885cb-afba-4724-be10-083e16f8bfc4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 754.532160] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquired lock "refresh_cache-af8885cb-afba-4724-be10-083e16f8bfc4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 754.532160] nova-compute[62208]: DEBUG nova.network.neutron [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 754.591036] nova-compute[62208]: DEBUG nova.network.neutron [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Instance 
cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 754.739116] nova-compute[62208]: DEBUG nova.compute.manager [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Received event network-vif-plugged-a19e8c2e-80b0-4986-8485-f7e977fc2a79 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 754.739235] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] Acquiring lock "af8885cb-afba-4724-be10-083e16f8bfc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 754.739423] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] Lock "af8885cb-afba-4724-be10-083e16f8bfc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 754.739586] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] Lock "af8885cb-afba-4724-be10-083e16f8bfc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 754.739749] nova-compute[62208]: DEBUG nova.compute.manager [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] No waiting events found dispatching network-vif-plugged-a19e8c2e-80b0-4986-8485-f7e977fc2a79 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 754.739992] nova-compute[62208]: WARNING nova.compute.manager [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Received unexpected event network-vif-plugged-a19e8c2e-80b0-4986-8485-f7e977fc2a79 for instance with vm_state building and task_state spawning. [ 754.740186] nova-compute[62208]: DEBUG nova.compute.manager [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Received event network-changed-a19e8c2e-80b0-4986-8485-f7e977fc2a79 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 754.740596] nova-compute[62208]: DEBUG nova.compute.manager [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Refreshing instance network info cache due to event network-changed-a19e8c2e-80b0-4986-8485-f7e977fc2a79. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 754.740807] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] Acquiring lock "refresh_cache-af8885cb-afba-4724-be10-083e16f8bfc4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 754.787440] nova-compute[62208]: DEBUG nova.network.neutron [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Updating instance_info_cache with network_info: [{"id": "a19e8c2e-80b0-4986-8485-f7e977fc2a79", "address": "fa:16:3e:d0:3b:be", "network": {"id": "d76cd4d5-b73d-415e-baa1-b83dfca3b907", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1811898223-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7c7660c6c500410eb713a843713ee74f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c13fd8bc-e797-42fe-94ed-6370d3467a7f", "external-id": "nsx-vlan-transportzone-833", "segmentation_id": 833, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa19e8c2e-80", "ovs_interfaceid": "a19e8c2e-80b0-4986-8485-f7e977fc2a79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 754.803961] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Releasing lock "refresh_cache-af8885cb-afba-4724-be10-083e16f8bfc4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 754.804319] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Instance network_info: |[{"id": "a19e8c2e-80b0-4986-8485-f7e977fc2a79", "address": "fa:16:3e:d0:3b:be", "network": {"id": "d76cd4d5-b73d-415e-baa1-b83dfca3b907", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1811898223-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7c7660c6c500410eb713a843713ee74f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c13fd8bc-e797-42fe-94ed-6370d3467a7f", "external-id": "nsx-vlan-transportzone-833", "segmentation_id": 833, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa19e8c2e-80", 
"ovs_interfaceid": "a19e8c2e-80b0-4986-8485-f7e977fc2a79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 754.804637] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] Acquired lock "refresh_cache-af8885cb-afba-4724-be10-083e16f8bfc4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 754.804819] nova-compute[62208]: DEBUG nova.network.neutron [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Refreshing network info cache for port a19e8c2e-80b0-4986-8485-f7e977fc2a79 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 754.806194] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d0:3b:be', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c13fd8bc-e797-42fe-94ed-6370d3467a7f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a19e8c2e-80b0-4986-8485-f7e977fc2a79', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 754.814012] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Creating folder: Project (7c7660c6c500410eb713a843713ee74f). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 754.818037] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-04706083-0a25-4911-854b-399cd88be6ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.820798] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 754.820798] nova-compute[62208]: warnings.warn( [ 754.830441] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Created folder: Project (7c7660c6c500410eb713a843713ee74f) in parent group-v17427. [ 754.830644] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Creating folder: Instances. Parent ref: group-v17481. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 754.830889] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f8832109-7fbf-4942-bd90-4233c6aca6aa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.832705] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 754.832705] nova-compute[62208]: warnings.warn( [ 754.840545] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Created folder: Instances in parent group-v17481. [ 754.840797] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 754.840990] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 754.841190] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0af5b688-0b82-4f5c-b2d2-f0a76adaff37 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.856161] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 754.856161] nova-compute[62208]: warnings.warn( [ 754.862219] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 754.862219] nova-compute[62208]: value = "task-38459" [ 754.862219] nova-compute[62208]: _type = "Task" [ 754.862219] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 754.865331] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 754.865331] nova-compute[62208]: warnings.warn( [ 754.871301] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38459, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 755.136642] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 755.161834] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 755.162094] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 755.176250] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 755.176526] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 755.176645] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 755.176804] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 755.178021] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6aa5055-92b6-4e03-925a-633991a1d378 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.181121] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.181121] nova-compute[62208]: warnings.warn( [ 755.186955] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6ace698-371b-4df8-a0f8-eb117982aacf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.191144] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.191144] nova-compute[62208]: warnings.warn( [ 755.204429] nova-compute[62208]: DEBUG nova.network.neutron [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Updated VIF entry in instance network info cache for port a19e8c2e-80b0-4986-8485-f7e977fc2a79. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 755.204985] nova-compute[62208]: DEBUG nova.network.neutron [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Updating instance_info_cache with network_info: [{"id": "a19e8c2e-80b0-4986-8485-f7e977fc2a79", "address": "fa:16:3e:d0:3b:be", "network": {"id": "d76cd4d5-b73d-415e-baa1-b83dfca3b907", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1811898223-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7c7660c6c500410eb713a843713ee74f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c13fd8bc-e797-42fe-94ed-6370d3467a7f", "external-id": "nsx-vlan-transportzone-833", "segmentation_id": 833, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa19e8c2e-80", "ovs_interfaceid": "a19e8c2e-80b0-4986-8485-f7e977fc2a79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.207073] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-061c25bb-0f8f-4804-a5c9-3e5a5b50d704 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.211003] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.211003] nova-compute[62208]: warnings.warn( [ 755.217511] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30452fd4-9760-47f6-a7d3-ed22bd3433d7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.223779] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b83bd0-eca2-4f3d-b718-773f548db4bf req-4afb668b-01ae-4765-ae0b-b7a7d26aaffb service nova] Releasing lock "refresh_cache-af8885cb-afba-4724-be10-083e16f8bfc4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 755.224357] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.224357] nova-compute[62208]: warnings.warn( [ 755.254322] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181912MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 755.254479] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 755.255438] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 755.323132] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 99698b8b-8a66-46ce-8bf1-cc00239e644b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.323301] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 2e938efc-55d2-4116-8989-354ec339579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.323430] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75fd2a2c-4ef5-4b42-b309-53cff148c772 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.323582] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.323691] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bd0eef47-56e8-45b6-92b1-e81400994572 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.323826] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9f48db49-1618-4b04-88a6-315c0f9b889a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.323957] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 858b585c-7746-4d38-84c9-b3ee719eb406 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.324128] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.324254] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.324391] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 755.346294] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.357317] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d51c3719-bd80-4ad9-945c-c50e16fb3fd1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.367461] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.367461] nova-compute[62208]: warnings.warn( [ 755.370457] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b58fe58a-9965-4f7e-808c-a5d004fd855e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.375184] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38459, 'name': CreateVM_Task, 'duration_secs': 0.319797} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 755.376089] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 755.376200] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 755.376426] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 755.379258] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07b779fc-2217-47a6-984d-c5fc7249bbc1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.391679] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance aae0a74f-3985-4a51-bae4-3b8124d7fe90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.392646] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.392646] nova-compute[62208]: warnings.warn( [ 755.403144] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c9b42581-3793-4641-be04-9a4b17b059cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.417235] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 755.418599] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.420500] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-afde604a-e1a5-4718-8958-630c3da9775b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.432541] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 61e911d7-b8e9-416e-b73c-574768744974 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.433662] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.433662] nova-compute[62208]: warnings.warn( [ 755.439656] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Waiting for the task: (returnval){ [ 755.439656] nova-compute[62208]: value = "task-38460" [ 755.439656] nova-compute[62208]: _type = "Task" [ 755.439656] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 755.445950] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f5f7e84c-2d39-4929-be15-e7c03fae4319 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.447013] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.447013] nova-compute[62208]: warnings.warn( [ 755.453772] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Task: {'id': task-38460, 'name': ReconfigVM_Task} progress is 10%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 755.456603] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d4e0170a-0993-4f7f-a7fa-6539bb13a082 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.469164] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 2681fbe1-7ed8-4280-95ac-f98063278b52 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.480075] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3be03d93-aae7-4312-832f-5a61b49753bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.491244] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 78471258-10a4-42e2-8d2a-f30b2baaa5d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.504201] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5d6be180-d89f-44ba-847e-0ea169316d90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.516583] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4474e61b-0664-40f7-a8ec-be3d14684b10 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.531153] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a12f05-6178-44bb-9eb0-b52d806fe91d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.542993] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 65d39cb0-8eed-49e2-a854-032d527cd0e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.557186] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bc074836-1520-44cf-aae0-acbfaa7a77e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.566238] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.576706] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.586771] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e21efb06-821b-4bec-a6d9-f57ae59d038a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 755.587124] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 755.587325] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 755.944441] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.944441] nova-compute[62208]: warnings.warn( [ 755.950776] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Task: {'id': task-38460, 'name': ReconfigVM_Task, 'duration_secs': 0.114009} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 755.953294] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 755.953521] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.577s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 755.953769] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 755.953913] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 755.954356] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 755.954788] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e7b8d934-0fb5-427f-b8cd-89be75920a47 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.956497] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.956497] nova-compute[62208]: warnings.warn( [ 755.959980] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Waiting for the task: (returnval){ [ 755.959980] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a98112-98c6-e3fd-27b4-bf1b2790be70" [ 755.959980] nova-compute[62208]: _type = "Task" [ 755.959980] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 755.974564] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 755.974564] nova-compute[62208]: warnings.warn( [ 755.984415] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 755.984730] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 755.984992] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 756.000619] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23af3062-a448-4d4a-be87-1c5cc3d8a1c9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.003065] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 756.003065] nova-compute[62208]: warnings.warn( [ 756.008697] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe5cfb5c-7c57-4931-a11d-89fa0f81f776 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.012160] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 756.012160] nova-compute[62208]: warnings.warn( [ 756.040373] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a0b9701-5eaa-41a3-a8f9-f2b2773214a5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.042629] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 756.042629] nova-compute[62208]: warnings.warn( [ 756.048667] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56d5fb1b-cc8b-42d4-8dfb-c85fff76573b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.052572] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 756.052572] nova-compute[62208]: warnings.warn( [ 756.062932] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 756.074800] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 756.088993] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 756.088993] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.833s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 757.088141] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 761.651996] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None 
req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 761.651996] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 761.652711] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 761.654057] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 761.654301] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Copying Virtual Disk [datastore2] vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/64620c00-517d-406a-af70-95858e00365b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 761.654581] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ed3e4184-420e-483c-9f13-7dec8e83e84f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.657594] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 761.657594] nova-compute[62208]: warnings.warn( [ 761.663241] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 761.663241] nova-compute[62208]: value = "task-38461" [ 761.663241] nova-compute[62208]: _type = "Task" [ 761.663241] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 761.666403] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 761.666403] nova-compute[62208]: warnings.warn( [ 761.672985] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38461, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 762.168167] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.168167] nova-compute[62208]: warnings.warn( [ 762.174614] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 762.174919] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 762.175477] nova-compute[62208]: Faults: ['InvalidArgument'] [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Traceback (most recent call last): [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] yield resources [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] self.driver.spawn(context, instance, image_meta, [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] self._fetch_image_if_missing(context, vi) [ 762.175477] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] image_cache(vi, tmp_image_ds_loc) [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] vm_util.copy_virtual_disk( [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] 
session._wait_for_task(vmdk_copy_task) [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] return self.wait_for_task(task_ref) [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] return evt.wait() [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] result = hub.switch() [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 762.175863] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] return self.greenlet.switch() [ 762.176250] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 762.176250] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] self.f(*self.args, **self.kw) [ 762.176250] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 762.176250] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] raise exceptions.translate_fault(task_info.error) [ 762.176250] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 762.176250] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Faults: ['InvalidArgument'] [ 762.176250] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] [ 762.176250] nova-compute[62208]: INFO nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Terminating instance [ 762.177631] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 762.177631] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 
tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 762.178172] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "refresh_cache-99698b8b-8a66-46ce-8bf1-cc00239e644b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 762.178490] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "refresh_cache-99698b8b-8a66-46ce-8bf1-cc00239e644b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 762.179182] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 762.180208] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9d1d7f75-dbeb-4913-a645-488730c3787e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.182367] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.182367] nova-compute[62208]: warnings.warn( [ 762.191235] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 762.191434] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 762.192599] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ce12ed0c-8b2e-42d3-8eee-2323fa95103d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.195715] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.195715] nova-compute[62208]: warnings.warn( [ 762.199566] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Waiting for the task: (returnval){ [ 762.199566] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525e75c3-6333-e4d5-8842-4dffd9a61918" [ 762.199566] nova-compute[62208]: _type = "Task" [ 762.199566] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 762.204166] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.204166] nova-compute[62208]: warnings.warn( [ 762.209668] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525e75c3-6333-e4d5-8842-4dffd9a61918, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 762.210545] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 762.241843] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.251284] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "refresh_cache-99698b8b-8a66-46ce-8bf1-cc00239e644b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 762.251754] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 762.252011] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 762.253167] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ffec770-b2a5-4e98-b5f1-c1a1d8cc0554 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.256231] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.256231] nova-compute[62208]: warnings.warn( [ 762.262426] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 762.262707] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0507e849-c458-4e26-8187-a5c7d7d5e219 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.264342] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.264342] nova-compute[62208]: warnings.warn( [ 762.300273] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 762.300741] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 762.300955] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Deleting the datastore file [datastore2] 99698b8b-8a66-46ce-8bf1-cc00239e644b {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 762.301255] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2b507353-0439-4853-aadb-a40a71d7fcd6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.303109] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.303109] nova-compute[62208]: warnings.warn( [ 762.308410] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 762.308410] nova-compute[62208]: value = "task-38463" [ 762.308410] nova-compute[62208]: _type = "Task" [ 762.308410] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 762.311924] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.311924] nova-compute[62208]: warnings.warn( [ 762.317357] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38463, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 762.703214] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.703214] nova-compute[62208]: warnings.warn( [ 762.709619] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 762.709982] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Creating directory with path [datastore2] vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 762.710129] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-21aa4456-b93e-4e70-b595-f1453fe8435f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.711851] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.711851] nova-compute[62208]: warnings.warn( [ 762.722159] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Created directory with path [datastore2] vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 762.722379] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Fetch image to [datastore2] vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 762.722549] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 762.723319] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f33eca0-713f-4eab-bf21-3d0d2725461e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.725709] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified 
HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.725709] nova-compute[62208]: warnings.warn( [ 762.735713] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16d1a1d9-4f31-4c00-8b2a-1f82b918fb63 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.738067] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.738067] nova-compute[62208]: warnings.warn( [ 762.745676] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dcf614a-6d8d-4f31-9060-17b732b4cc1b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.749300] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.749300] nova-compute[62208]: warnings.warn( [ 762.784297] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a71f03c5-fa10-4f35-9945-045f7b46f672 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.786783] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.786783] nova-compute[62208]: warnings.warn( [ 762.791383] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5bc3ec74-a483-4557-b87e-0f933e90ce64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.793085] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.793085] nova-compute[62208]: warnings.warn( [ 762.815308] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 762.817551] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 762.817551] nova-compute[62208]: warnings.warn( [ 762.823802] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38463, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.032869} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 762.824244] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 762.824560] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 762.824856] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 762.825139] nova-compute[62208]: INFO nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Took 0.57 seconds to destroy the instance on the hypervisor. [ 762.825654] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 762.825932] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 762.826156] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 762.856485] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 762.863920] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.870024] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 762.928478] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Took 0.10 seconds to deallocate network for instance. [ 762.931034] nova-compute[62208]: DEBUG nova.compute.claims [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Aborting claim: <nova.compute.claims.Claim object at 0x7fb943a141f0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 762.931224] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 762.931454] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 762.934364] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 762.934528] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 763.390413] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-239512f0-ad70-4a15-aed6-ca2958a9d3b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.393528] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 763.393528] nova-compute[62208]: warnings.warn( [ 763.399210] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-404dbd72-e1cb-4515-ada2-fc86b5fd18ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.402373] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 763.402373] nova-compute[62208]: warnings.warn( [ 763.431995] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90941c21-5bc6-481f-b7c5-f708e152cefb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.434686] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 763.434686] nova-compute[62208]: warnings.warn( [ 763.440335] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-945a22f7-3b2c-4951-a384-a105838e068f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.444792] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 763.444792] nova-compute[62208]: warnings.warn( [ 763.455337] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 763.467768] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 763.484563] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.553s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 763.485104] nova-compute[62208]: Faults: ['InvalidArgument'] [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Traceback (most recent call last): [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] self.driver.spawn(context, instance, image_meta, [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] self._fetch_image_if_missing(context, vi) [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] image_cache(vi, tmp_image_ds_loc) [ 763.485104] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] vm_util.copy_virtual_disk( [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] session._wait_for_task(vmdk_copy_task) [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] return self.wait_for_task(task_ref) [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] return evt.wait() [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] result = hub.switch() [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] return self.greenlet.switch() [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 763.485498] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] self.f(*self.args, **self.kw) [ 763.485801] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 763.485801] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] raise exceptions.translate_fault(task_info.error) [ 763.485801] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 763.485801] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Faults: ['InvalidArgument'] [ 763.485801] nova-compute[62208]: ERROR nova.compute.manager [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] [ 763.485934] nova-compute[62208]: DEBUG nova.compute.utils [None 
req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 763.487285] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Build of instance 99698b8b-8a66-46ce-8bf1-cc00239e644b was re-scheduled: A specified parameter was not correct: fileType [ 763.487285] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 763.487680] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 763.487911] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "refresh_cache-99698b8b-8a66-46ce-8bf1-cc00239e644b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 763.488076] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "refresh_cache-99698b8b-8a66-46ce-8bf1-cc00239e644b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 763.488240] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 763.514966] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 763.543048] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.553179] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "refresh_cache-99698b8b-8a66-46ce-8bf1-cc00239e644b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 763.553389] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 763.553608] nova-compute[62208]: DEBUG nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 763.553777] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 763.582753] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 763.595826] nova-compute[62208]: DEBUG nova.network.neutron [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.606651] nova-compute[62208]: INFO nova.compute.manager [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: 99698b8b-8a66-46ce-8bf1-cc00239e644b] Took 0.05 seconds to deallocate network for instance. 
[ 763.722370] nova-compute[62208]: INFO nova.scheduler.client.report [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Deleted allocations for instance 99698b8b-8a66-46ce-8bf1-cc00239e644b [ 763.742949] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a3040af2-64c8-4db2-8ad3-5b10b03d0da8 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "99698b8b-8a66-46ce-8bf1-cc00239e644b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 184.512s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 763.785209] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 763.841921] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 763.842118] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 763.843693] nova-compute[62208]: INFO nova.compute.claims [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 764.265627] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-073c4913-0aaa-4d6b-a877-930f378ef36b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.269485] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 764.269485] nova-compute[62208]: warnings.warn( [ 764.275160] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35d5ec7c-8fdb-48d1-bf0b-20223bcf7d0d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.278180] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 764.278180] nova-compute[62208]: warnings.warn( [ 764.306051] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bb0ecf6-5494-4ab9-82ff-c82403e4d2f7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.308643] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 764.308643] nova-compute[62208]: warnings.warn( [ 764.314329] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79db32ca-0467-4347-a4ad-e4b4d0424d70 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.318271] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 764.318271] nova-compute[62208]: warnings.warn( [ 764.329185] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 764.338615] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 764.357444] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.515s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 764.358014] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 764.401533] nova-compute[62208]: DEBUG nova.compute.utils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 764.403366] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 764.403366] nova-compute[62208]: DEBUG nova.network.neutron [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 764.416644] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 764.462969] nova-compute[62208]: DEBUG nova.policy [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '38647025a1334a08b2af4d0c24cb2012', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98153a183e80454c8e76785de06f8481', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 764.497450] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 764.519669] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 764.519923] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 764.520110] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 764.520300] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 764.520448] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 764.520594] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 764.520802] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 764.520962] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 764.521132] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 764.521295] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 764.521469] nova-compute[62208]: DEBUG nova.virt.hardware [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 764.522304] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6540c8f3-b03f-4bef-aba1-2fa2b2d16b60 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.524845] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 764.524845] nova-compute[62208]: warnings.warn( [ 764.530847] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94f7aca8-85f1-4281-aa61-1e16a6d35217 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.535897] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 764.535897] nova-compute[62208]: warnings.warn( [ 764.918968] nova-compute[62208]: DEBUG nova.network.neutron [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Successfully created port: dcefbabc-ae23-48fb-9557-7423f48afc2e {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 765.075814] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 765.076051] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 765.640289] nova-compute[62208]: DEBUG nova.compute.manager [req-f7a19edd-afdb-4c0c-a7eb-f9ff2b276fed req-a9548afc-8788-4e4b-942a-44b7c07de7bb service nova] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Received event network-vif-plugged-dcefbabc-ae23-48fb-9557-7423f48afc2e {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 765.640515] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f7a19edd-afdb-4c0c-a7eb-f9ff2b276fed req-a9548afc-8788-4e4b-942a-44b7c07de7bb service nova] Acquiring lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 765.640748] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f7a19edd-afdb-4c0c-a7eb-f9ff2b276fed req-a9548afc-8788-4e4b-942a-44b7c07de7bb service nova] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 765.640979] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f7a19edd-afdb-4c0c-a7eb-f9ff2b276fed req-a9548afc-8788-4e4b-942a-44b7c07de7bb service nova] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 765.641101] nova-compute[62208]: DEBUG nova.compute.manager [req-f7a19edd-afdb-4c0c-a7eb-f9ff2b276fed req-a9548afc-8788-4e4b-942a-44b7c07de7bb service nova] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] No waiting events found dispatching network-vif-plugged-dcefbabc-ae23-48fb-9557-7423f48afc2e {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 765.641331] 
nova-compute[62208]: WARNING nova.compute.manager [req-f7a19edd-afdb-4c0c-a7eb-f9ff2b276fed req-a9548afc-8788-4e4b-942a-44b7c07de7bb service nova] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Received unexpected event network-vif-plugged-dcefbabc-ae23-48fb-9557-7423f48afc2e for instance with vm_state building and task_state spawning. [ 765.745720] nova-compute[62208]: DEBUG nova.network.neutron [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Successfully updated port: dcefbabc-ae23-48fb-9557-7423f48afc2e {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 765.755010] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "refresh_cache-14c248a0-9f16-40a5-a8c2-06536fdd8cb7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 765.755168] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquired lock "refresh_cache-14c248a0-9f16-40a5-a8c2-06536fdd8cb7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 765.755323] nova-compute[62208]: DEBUG nova.network.neutron [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 765.814626] nova-compute[62208]: DEBUG nova.network.neutron [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 766.066595] nova-compute[62208]: DEBUG nova.network.neutron [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Updating instance_info_cache with network_info: [{"id": "dcefbabc-ae23-48fb-9557-7423f48afc2e", "address": "fa:16:3e:ec:e2:d3", "network": {"id": "3c7ef985-46db-4ae1-a121-45e04d89a16b", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-534497313-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "98153a183e80454c8e76785de06f8481", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c92f34c-1dd7-4dc5-b8e8-f6c55cc5b4b8", "external-id": "nsx-vlan-transportzone-850", "segmentation_id": 850, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdcefbabc-ae", "ovs_interfaceid": "dcefbabc-ae23-48fb-9557-7423f48afc2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 766.101538] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Releasing lock "refresh_cache-14c248a0-9f16-40a5-a8c2-06536fdd8cb7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 766.101866] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Instance network_info: |[{"id": "dcefbabc-ae23-48fb-9557-7423f48afc2e", "address": "fa:16:3e:ec:e2:d3", "network": {"id": "3c7ef985-46db-4ae1-a121-45e04d89a16b", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-534497313-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "98153a183e80454c8e76785de06f8481", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c92f34c-1dd7-4dc5-b8e8-f6c55cc5b4b8", "external-id": "nsx-vlan-transportzone-850", "segmentation_id": 850, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdcefbabc-ae", "ovs_interfaceid": "dcefbabc-ae23-48fb-9557-7423f48afc2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 766.102300] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ec:e2:d3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3c92f34c-1dd7-4dc5-b8e8-f6c55cc5b4b8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dcefbabc-ae23-48fb-9557-7423f48afc2e', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 766.109814] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Creating folder: Project (98153a183e80454c8e76785de06f8481). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 766.110530] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0fe519f0-1339-4899-be4e-52b4176c60a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.114549] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 766.114549] nova-compute[62208]: warnings.warn( [ 766.124789] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Created folder: Project (98153a183e80454c8e76785de06f8481) in parent group-v17427. [ 766.125157] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Creating folder: Instances. Parent ref: group-v17484. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 766.125580] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b6c4d868-7d9e-49d9-9e79-c081e68d930d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.128659] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 766.128659] nova-compute[62208]: warnings.warn( [ 766.136725] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Created folder: Instances in parent group-v17484. 
[ 766.137039] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 766.137247] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 766.137497] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-685fb885-1c4c-4fe8-946b-25eb8e9e9ead {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.153666] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 766.153666] nova-compute[62208]: warnings.warn( [ 766.160136] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 766.160136] nova-compute[62208]: value = "task-38466" [ 766.160136] nova-compute[62208]: _type = "Task" [ 766.160136] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 766.163452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 766.163452] nova-compute[62208]: warnings.warn( [ 766.168802] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38466, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 766.664564] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 766.664564] nova-compute[62208]: warnings.warn( [ 766.671104] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38466, 'name': CreateVM_Task, 'duration_secs': 0.336879} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 766.671331] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 766.672036] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 766.672263] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 766.675130] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bbde124-4c90-4840-ab6e-14b456e835c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.685159] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 766.685159] nova-compute[62208]: warnings.warn( [ 766.707645] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Reconfiguring VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 766.708074] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-52f6539a-075e-4eed-bf62-e5ca113dddf4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.719604] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 766.719604] nova-compute[62208]: warnings.warn( [ 766.725663] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Waiting for the task: (returnval){ [ 766.725663] nova-compute[62208]: value = "task-38467" [ 766.725663] nova-compute[62208]: _type = "Task" [ 766.725663] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 766.728638] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 766.728638] nova-compute[62208]: warnings.warn( [ 766.737053] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Task: {'id': task-38467, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 767.229819] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 767.229819] nova-compute[62208]: warnings.warn( [ 767.237572] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Task: {'id': task-38467, 'name': ReconfigVM_Task} progress is 99%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 767.671079] nova-compute[62208]: DEBUG nova.compute.manager [req-4b648e3c-602e-4888-86f4-eba06af20cf3 req-084e6ce2-9c00-4797-8a16-8a904ebdd5ec service nova] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Received event network-changed-dcefbabc-ae23-48fb-9557-7423f48afc2e {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 767.671271] nova-compute[62208]: DEBUG nova.compute.manager [req-4b648e3c-602e-4888-86f4-eba06af20cf3 req-084e6ce2-9c00-4797-8a16-8a904ebdd5ec service nova] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Refreshing instance network info cache due to event network-changed-dcefbabc-ae23-48fb-9557-7423f48afc2e. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 767.671480] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4b648e3c-602e-4888-86f4-eba06af20cf3 req-084e6ce2-9c00-4797-8a16-8a904ebdd5ec service nova] Acquiring lock "refresh_cache-14c248a0-9f16-40a5-a8c2-06536fdd8cb7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 767.671622] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4b648e3c-602e-4888-86f4-eba06af20cf3 req-084e6ce2-9c00-4797-8a16-8a904ebdd5ec service nova] Acquired lock "refresh_cache-14c248a0-9f16-40a5-a8c2-06536fdd8cb7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 767.672228] nova-compute[62208]: DEBUG nova.network.neutron [req-4b648e3c-602e-4888-86f4-eba06af20cf3 req-084e6ce2-9c00-4797-8a16-8a904ebdd5ec service nova] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Refreshing network info cache for port dcefbabc-ae23-48fb-9557-7423f48afc2e {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 767.733050] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 767.733050] nova-compute[62208]: warnings.warn( [ 767.739189] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Task: {'id': task-38467, 'name': ReconfigVM_Task, 'duration_secs': 0.512031} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 767.739676] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Reconfigured VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 767.739957] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 1.068s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 767.740828] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 767.740984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 767.741431] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 767.741739] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aae6ab0a-d1e5-485c-beb6-18d93d35b557 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 767.743393] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 767.743393] nova-compute[62208]: warnings.warn( [ 767.747243] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Waiting for the task: (returnval){ [ 767.747243] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5253e813-ef21-4310-8cd5-24ae2e535f91" [ 767.747243] nova-compute[62208]: _type = "Task" [ 767.747243] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 767.750995] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 767.750995] nova-compute[62208]: warnings.warn( [ 767.756465] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5253e813-ef21-4310-8cd5-24ae2e535f91, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 768.026604] nova-compute[62208]: DEBUG nova.network.neutron [req-4b648e3c-602e-4888-86f4-eba06af20cf3 req-084e6ce2-9c00-4797-8a16-8a904ebdd5ec service nova] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Updated VIF entry in instance network info cache for port dcefbabc-ae23-48fb-9557-7423f48afc2e. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 768.026938] nova-compute[62208]: DEBUG nova.network.neutron [req-4b648e3c-602e-4888-86f4-eba06af20cf3 req-084e6ce2-9c00-4797-8a16-8a904ebdd5ec service nova] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Updating instance_info_cache with network_info: [{"id": "dcefbabc-ae23-48fb-9557-7423f48afc2e", "address": "fa:16:3e:ec:e2:d3", "network": {"id": "3c7ef985-46db-4ae1-a121-45e04d89a16b", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-534497313-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "98153a183e80454c8e76785de06f8481", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c92f34c-1dd7-4dc5-b8e8-f6c55cc5b4b8", "external-id": "nsx-vlan-transportzone-850", "segmentation_id": 850, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdcefbabc-ae", "ovs_interfaceid": "dcefbabc-ae23-48fb-9557-7423f48afc2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 768.040538] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4b648e3c-602e-4888-86f4-eba06af20cf3 req-084e6ce2-9c00-4797-8a16-8a904ebdd5ec service nova] Releasing lock "refresh_cache-14c248a0-9f16-40a5-a8c2-06536fdd8cb7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 768.251272] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 768.251272] nova-compute[62208]: warnings.warn( [ 768.257758] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 768.258034] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 768.258245] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 768.555270] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f98d9b3b-de29-464b-ab9a-9c5055325504 tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] Acquiring lock "b619a949-11d4-4178-9424-54841ee6c26e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 768.555983] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f98d9b3b-de29-464b-ab9a-9c5055325504 tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] Lock "b619a949-11d4-4178-9424-54841ee6c26e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 768.924636] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e50d2ada-6c42-419f-a7a5-5b2dc8e5907c tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] Acquiring lock "ecd5e5ca-bcef-45ed-b7b5-fd572d80bd50" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 768.925003] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e50d2ada-6c42-419f-a7a5-5b2dc8e5907c tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] Lock "ecd5e5ca-bcef-45ed-b7b5-fd572d80bd50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 779.017328] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 
tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "2e938efc-55d2-4116-8989-354ec339579f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 779.411977] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "75fd2a2c-4ef5-4b42-b309-53cff148c772" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 780.144265] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.651187] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "bd0eef47-56e8-45b6-92b1-e81400994572" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 800.548101] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "9f48db49-1618-4b04-88a6-315c0f9b889a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 805.242897] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "858b585c-7746-4d38-84c9-b3ee719eb406" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 805.505198] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 805.647698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "08336643-4254-4447-b7c2-b81054bf9707" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 807.957406] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.105075] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "af8885cb-afba-4724-be10-083e16f8bfc4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.279543] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 810.279543] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 810.280103] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 810.281859] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Caching image {{(pid=62208) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 810.282157] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Copying Virtual Disk [datastore2] vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/9b1257a0-223d-474d-b384-3cd0b2e5e728/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 810.282509] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-31d3a956-d0d7-4994-877a-80ea6b8f5e09 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.285097] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.285097] nova-compute[62208]: warnings.warn( [ 810.290965] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Waiting for the task: (returnval){ [ 810.290965] nova-compute[62208]: value = "task-38468" [ 810.290965] nova-compute[62208]: _type = "Task" [ 810.290965] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 810.295253] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.295253] nova-compute[62208]: warnings.warn( [ 810.300706] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Task: {'id': task-38468, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 810.798347] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.798347] nova-compute[62208]: warnings.warn( [ 810.805063] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 810.805307] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 810.805888] nova-compute[62208]: Faults: ['InvalidArgument'] [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] Traceback (most recent call last): [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] yield resources [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] self.driver.spawn(context, instance, image_meta, [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] self._fetch_image_if_missing(context, vi) [ 810.805888] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] image_cache(vi, tmp_image_ds_loc) [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] vm_util.copy_virtual_disk( [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] 
session._wait_for_task(vmdk_copy_task) [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] return self.wait_for_task(task_ref) [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] return evt.wait() [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] result = hub.switch() [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 810.806321] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] return self.greenlet.switch() [ 810.806767] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 810.806767] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] self.f(*self.args, **self.kw) [ 810.806767] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 810.806767] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] raise exceptions.translate_fault(task_info.error) [ 810.806767] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 810.806767] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] Faults: ['InvalidArgument'] [ 810.806767] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] [ 810.806767] nova-compute[62208]: INFO nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Terminating instance [ 810.807801] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 810.808055] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 
tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 810.808543] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 810.808825] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquired lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 810.808825] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 810.809711] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ffaf0df7-398b-47da-865e-c00de61342c7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.811675] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.811675] nova-compute[62208]: warnings.warn( [ 810.820083] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 810.820279] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 810.822041] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b9baa46c-17a9-48cd-9ac8-70e678ec0c77 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.824661] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.824661] nova-compute[62208]: warnings.warn( [ 810.827970] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Waiting for the task: (returnval){ [ 810.827970] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52095b2a-fa44-e80f-2940-0c4bf7d6fd93" [ 810.827970] nova-compute[62208]: _type = "Task" [ 810.827970] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 810.831131] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.831131] nova-compute[62208]: warnings.warn( [ 810.837934] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52095b2a-fa44-e80f-2940-0c4bf7d6fd93, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 810.851784] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 810.879259] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.888654] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Releasing lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 810.889201] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 810.889451] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 810.890738] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62047e4a-846e-420d-88e9-cafc10d6ee1f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.893790] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.893790] nova-compute[62208]: warnings.warn( [ 810.899501] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 810.899824] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e2041a9a-a54d-4e88-af90-91dc46c51afb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.901383] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.901383] nova-compute[62208]: warnings.warn( [ 810.934258] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 810.934635] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 810.934854] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Deleting the datastore file [datastore2] 2e938efc-55d2-4116-8989-354ec339579f {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 810.935941] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fbcc2a92-d873-406a-baa4-fd8d8733bdf1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.937939] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.937939] nova-compute[62208]: warnings.warn( [ 810.943981] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Waiting for the task: (returnval){ [ 810.943981] nova-compute[62208]: value = "task-38470" [ 810.943981] nova-compute[62208]: _type = "Task" [ 810.943981] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 810.947865] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 810.947865] nova-compute[62208]: warnings.warn( [ 810.953693] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Task: {'id': task-38470, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 811.141116] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 811.141368] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11219}} [ 811.156955] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] There are 0 instances to clean {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11228}} [ 811.157203] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 811.157344] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances with incomplete migration {{(pid=62208) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11257}} [ 811.166661] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 811.332257] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 811.332257] nova-compute[62208]: warnings.warn( [ 811.338436] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 811.338717] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Creating directory with path [datastore2] vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 811.338943] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a1915817-82cb-40e6-8dd2-1195f8f05c47 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.340849] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 811.340849] nova-compute[62208]: warnings.warn( [ 811.351634] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Created directory with path [datastore2] vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 811.351634] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Fetch image to [datastore2] vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 811.351634] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 811.352303] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24134952-5ed6-49a7-8083-049b79704444 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.354856] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 811.354856] nova-compute[62208]: warnings.warn( [ 811.360312] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ac3f2c4-6e5b-4ad3-8aad-a07261b3f3f8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.362311] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 811.362311] nova-compute[62208]: warnings.warn( [ 811.370597] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67f55239-177f-4fc0-9b73-bed8ed9329f5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.374324] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 811.374324] nova-compute[62208]: warnings.warn( [ 811.401289] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-003b8744-c973-4374-9401-9a8bc55d0480 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.403649] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 811.403649] nova-compute[62208]: warnings.warn( [ 811.408057] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-133b6075-3229-40df-be73-5be619534f22 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.409775] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 811.409775] nova-compute[62208]: warnings.warn( [ 811.433491] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 811.460073] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 811.460073] nova-compute[62208]: warnings.warn( [ 811.461041] nova-compute[62208]: DEBUG oslo_vmware.api [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Task: {'id': task-38470, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.05203} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 811.461329] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 811.461492] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 811.461660] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 811.461834] nova-compute[62208]: INFO nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Took 0.57 seconds to destroy the instance on the hypervisor. [ 811.462096] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 811.462302] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 811.462398] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 811.483847] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 811.493377] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 811.504533] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Took 0.04 seconds to deallocate network for instance. 
[ 811.507200] nova-compute[62208]: DEBUG nova.compute.claims [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9375c5d20> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 811.507739] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.508095] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 811.512061] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 811.580701] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 811.580920] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 812.035163] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bca7ed5-48b6-43ad-81b6-de80640bebe9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.035465] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 812.035465] nova-compute[62208]: warnings.warn( [ 812.041575] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-215a4ec2-e8d3-43c4-8cca-24a78ae0662a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.046668] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 812.046668] nova-compute[62208]: warnings.warn( [ 812.074588] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa7aa7ec-810f-4ed2-85a2-aff9f0d2bc3c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.078026] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 812.078026] nova-compute[62208]: warnings.warn( [ 812.084187] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36e7689f-16ef-46e0-9fdf-e1472d873633 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.088105] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 812.088105] nova-compute[62208]: warnings.warn( [ 812.101076] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 812.110312] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 812.135081] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.627s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 812.136708] nova-compute[62208]: Faults: ['InvalidArgument'] [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] Traceback (most recent call last): [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] self.driver.spawn(context, instance, image_meta, [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] self._fetch_image_if_missing(context, vi) [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] image_cache(vi, tmp_image_ds_loc) [ 812.136708] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] vm_util.copy_virtual_disk( [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] session._wait_for_task(vmdk_copy_task) [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] return self.wait_for_task(task_ref) [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] return evt.wait() [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] result = hub.switch() [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] return self.greenlet.switch() [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 812.137114] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] self.f(*self.args, **self.kw) [ 812.137426] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 812.137426] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] raise exceptions.translate_fault(task_info.error) [ 812.137426] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 812.137426] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] Faults: ['InvalidArgument'] [ 812.137426] nova-compute[62208]: ERROR nova.compute.manager [instance: 2e938efc-55d2-4116-8989-354ec339579f] [ 812.142214] nova-compute[62208]: DEBUG nova.compute.utils [None 
req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 812.144584] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Build of instance 2e938efc-55d2-4116-8989-354ec339579f was re-scheduled: A specified parameter was not correct: fileType [ 812.144584] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 812.146147] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 812.146514] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 812.146790] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquired lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 812.147069] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 812.199324] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 812.262762] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 812.279900] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Releasing lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 812.280239] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 812.280472] nova-compute[62208]: DEBUG nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 812.280990] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 812.304129] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 812.313879] nova-compute[62208]: DEBUG nova.network.neutron [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 812.322702] nova-compute[62208]: INFO nova.compute.manager [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Took 0.04 seconds to deallocate network for instance. 
[ 812.436869] nova-compute[62208]: INFO nova.scheduler.client.report [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Deleted allocations for instance 2e938efc-55d2-4116-8989-354ec339579f [ 812.456164] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0095cf73-f8ee-428f-9d63-eec787141f98 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "2e938efc-55d2-4116-8989-354ec339579f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 232.187s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.457566] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "2e938efc-55d2-4116-8989-354ec339579f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 33.440s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.458576] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "2e938efc-55d2-4116-8989-354ec339579f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 812.459089] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "2e938efc-55d2-4116-8989-354ec339579f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.459420] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "2e938efc-55d2-4116-8989-354ec339579f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.462169] nova-compute[62208]: INFO nova.compute.manager [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Terminating instance [ 812.464913] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquiring lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 812.465215] nova-compute[62208]: DEBUG 
oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Acquired lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 812.465846] nova-compute[62208]: DEBUG nova.network.neutron [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 812.481358] nova-compute[62208]: DEBUG nova.compute.manager [None req-651c8dda-94fb-4000-bf55-b1cc3f5a8f84 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: d51c3719-bd80-4ad9-945c-c50e16fb3fd1] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 812.502446] nova-compute[62208]: DEBUG nova.network.neutron [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 812.514016] nova-compute[62208]: DEBUG nova.compute.manager [None req-651c8dda-94fb-4000-bf55-b1cc3f5a8f84 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: d51c3719-bd80-4ad9-945c-c50e16fb3fd1] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 812.542115] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-651c8dda-94fb-4000-bf55-b1cc3f5a8f84 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "d51c3719-bd80-4ad9-945c-c50e16fb3fd1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 200.493s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.551902] nova-compute[62208]: DEBUG nova.network.neutron [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 812.561904] nova-compute[62208]: DEBUG nova.compute.manager [None req-4783afe8-73df-4225-b786-f99c264c99ba tempest-ServersTestBootFromVolume-245018572 tempest-ServersTestBootFromVolume-245018572-project-member] [instance: b58fe58a-9965-4f7e-808c-a5d004fd855e] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 812.567103] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Releasing lock "refresh_cache-2e938efc-55d2-4116-8989-354ec339579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 812.567736] nova-compute[62208]: DEBUG nova.compute.manager [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 812.568263] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 812.569001] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3d0846b1-ad4c-43b8-a8fd-77905f518115 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.571767] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 812.571767] nova-compute[62208]: warnings.warn( [ 812.581188] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bea06321-c1a8-4837-95f1-4a135502921d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.595949] nova-compute[62208]: DEBUG nova.compute.manager [None req-4783afe8-73df-4225-b786-f99c264c99ba tempest-ServersTestBootFromVolume-245018572 tempest-ServersTestBootFromVolume-245018572-project-member] [instance: b58fe58a-9965-4f7e-808c-a5d004fd855e] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 812.597308] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 812.597308] nova-compute[62208]: warnings.warn( [ 812.617610] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2e938efc-55d2-4116-8989-354ec339579f could not be found. 
[ 812.618181] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 812.618552] nova-compute[62208]: INFO nova.compute.manager [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Took 0.05 seconds to destroy the instance on the hypervisor. [ 812.619354] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 812.623362] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 812.623625] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 812.639328] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4783afe8-73df-4225-b786-f99c264c99ba tempest-ServersTestBootFromVolume-245018572 tempest-ServersTestBootFromVolume-245018572-project-member] Lock "b58fe58a-9965-4f7e-808c-a5d004fd855e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 199.040s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.650015] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 812.653445] nova-compute[62208]: DEBUG nova.compute.manager [None req-b098499c-d98a-4bf0-8b5d-c1074793399a tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: aae0a74f-3985-4a51-bae4-3b8124d7fe90] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 812.658367] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 812.667119] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 2e938efc-55d2-4116-8989-354ec339579f] Took 0.04 seconds to deallocate network for instance. [ 812.692732] nova-compute[62208]: DEBUG nova.compute.manager [None req-b098499c-d98a-4bf0-8b5d-c1074793399a tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: aae0a74f-3985-4a51-bae4-3b8124d7fe90] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 812.721474] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b098499c-d98a-4bf0-8b5d-c1074793399a tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "aae0a74f-3985-4a51-bae4-3b8124d7fe90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 197.040s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.749271] nova-compute[62208]: DEBUG nova.compute.manager [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 812.834903] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 812.835238] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.837185] nova-compute[62208]: INFO nova.compute.claims [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 812.864824] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7add04e8-1a5c-4bbb-9e67-0ec920b29994 tempest-ServerDiagnosticsNegativeTest-1379746467 tempest-ServerDiagnosticsNegativeTest-1379746467-project-member] Lock "2e938efc-55d2-4116-8989-354ec339579f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.407s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 813.331292] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31492ec6-b952-4a89-8e24-c42943f2723e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.334347] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 813.334347] nova-compute[62208]: warnings.warn( [ 813.340088] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-028a8f60-6324-430a-bc50-40eb1b941f1e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.343670] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 813.343670] nova-compute[62208]: warnings.warn( [ 813.377009] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9613adc2-502a-4984-9801-ab81ceae571b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.380740] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 813.380740] nova-compute[62208]: warnings.warn( [ 813.386294] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40ce739e-f042-470b-8d7b-ce577f078c83 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.390255] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 813.390255] nova-compute[62208]: warnings.warn( [ 813.400293] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 813.413343] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 813.435329] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.600s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 813.435838] nova-compute[62208]: DEBUG nova.compute.manager [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 813.439305] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquiring lock "c9b42581-3793-4641-be04-9a4b17b059cb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 813.480636] nova-compute[62208]: DEBUG nova.compute.claims [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9371a6230> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 813.481448] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 813.481448] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 813.759044] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 813.759284] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 813.996557] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12632faa-a358-42ee-acb0-444e939180df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.999045] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 813.999045] nova-compute[62208]: warnings.warn( [ 814.004675] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76b055f4-bc64-466a-9580-0dfbe5a45490 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.008113] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 814.008113] nova-compute[62208]: warnings.warn( [ 814.035717] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3675354a-9daf-4b14-9da4-ce0875e03da9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.038519] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 814.038519] nova-compute[62208]: warnings.warn( [ 814.044568] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dacea14-646a-40c0-abfb-94d217d51458 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.048911] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 814.048911] nova-compute[62208]: warnings.warn( [ 814.060470] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 814.069700] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 814.088837] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.608s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 814.089612] nova-compute[62208]: DEBUG nova.compute.utils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Conflict updating instance c9b42581-3793-4641-be04-9a4b17b059cb. Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'} {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 814.091511] nova-compute[62208]: DEBUG nova.compute.manager [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Instance disappeared during build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2504}} [ 814.091687] nova-compute[62208]: DEBUG nova.compute.manager [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 814.091905] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquiring lock "refresh_cache-c9b42581-3793-4641-be04-9a4b17b059cb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 814.092062] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquired lock "refresh_cache-c9b42581-3793-4641-be04-9a4b17b059cb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 814.092223] nova-compute[62208]: DEBUG nova.network.neutron [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 814.139061] nova-compute[62208]: DEBUG nova.network.neutron [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 814.173084] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 814.173273] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 814.173395] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 814.202120] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.202282] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.202350] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.202483] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.202611] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.202736] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.202858] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.202979] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.203096] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 814.203217] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 814.203735] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 814.203913] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 814.204090] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 814.204266] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 814.295273] nova-compute[62208]: DEBUG nova.network.neutron [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 814.304454] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Releasing lock "refresh_cache-c9b42581-3793-4641-be04-9a4b17b059cb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 814.304681] nova-compute[62208]: DEBUG nova.compute.manager [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 814.304940] nova-compute[62208]: DEBUG nova.compute.manager [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 814.305105] nova-compute[62208]: DEBUG nova.network.neutron [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 814.328630] nova-compute[62208]: DEBUG nova.network.neutron [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 814.338271] nova-compute[62208]: DEBUG nova.network.neutron [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 814.354803] nova-compute[62208]: INFO nova.compute.manager [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Took 0.04 seconds to deallocate network for instance. [ 814.556409] nova-compute[62208]: INFO nova.scheduler.client.report [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Deleted allocations for instance c9b42581-3793-4641-be04-9a4b17b059cb [ 814.556800] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0537494f-91cd-4696-9be7-d0cc8dc07593 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "c9b42581-3793-4641-be04-9a4b17b059cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 198.269s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 814.559150] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "c9b42581-3793-4641-be04-9a4b17b059cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 1.120s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 814.559401] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquiring lock "c9b42581-3793-4641-be04-9a4b17b059cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 814.559618] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "c9b42581-3793-4641-be04-9a4b17b059cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 814.559766] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "c9b42581-3793-4641-be04-9a4b17b059cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 814.563488] nova-compute[62208]: INFO nova.compute.manager [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 
tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Terminating instance [ 814.565520] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquiring lock "refresh_cache-c9b42581-3793-4641-be04-9a4b17b059cb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 814.565702] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Acquired lock "refresh_cache-c9b42581-3793-4641-be04-9a4b17b059cb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 814.565900] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 814.577256] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 814.610287] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 814.643487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 814.643540] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 814.644994] nova-compute[62208]: INFO nova.compute.claims [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 814.751603] nova-compute[62208]: DEBUG nova.network.neutron [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 814.766433] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Releasing lock "refresh_cache-c9b42581-3793-4641-be04-9a4b17b059cb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 814.767051] nova-compute[62208]: DEBUG nova.compute.manager [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 814.767694] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 814.768298] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d588ab5b-d202-4636-b20b-235174939836 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.770727] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 814.770727] nova-compute[62208]: warnings.warn( [ 814.780471] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-951dffc0-09f7-4e20-a97b-4aa32f71281f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.793539] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 814.793539] nova-compute[62208]: warnings.warn( [ 814.819282] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c9b42581-3793-4641-be04-9a4b17b059cb could not be found. [ 814.819489] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 814.819667] nova-compute[62208]: INFO nova.compute.manager [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Took 0.05 seconds to destroy the instance on the hypervisor. [ 814.820077] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 814.821393] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 814.821393] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 814.840995] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 814.856593] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 814.867510] nova-compute[62208]: INFO nova.compute.manager [-] [instance: c9b42581-3793-4641-be04-9a4b17b059cb] Took 0.05 seconds to deallocate network for instance. 
[ 815.023928] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f5fe2540-c2f9-4ebb-898f-eeb7f94e9469 tempest-ServersTestJSON-1344348499 tempest-ServersTestJSON-1344348499-project-member] Lock "c9b42581-3793-4641-be04-9a4b17b059cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.465s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 815.141024] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.141245] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.141407] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.151081] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 815.200912] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f99a752-a6bf-4079-9b91-08a5101a2119 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.203761] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.203761] nova-compute[62208]: warnings.warn( [ 815.209338] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4e8d3a1-9029-4197-a75f-f8bcd6044d51 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.212357] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.212357] nova-compute[62208]: warnings.warn( [ 815.255843] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4e6c126-79d6-48e3-b8fb-6008a2565fe4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.259585] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.259585] nova-compute[62208]: warnings.warn( [ 815.265343] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-665f9359-c9e3-488b-a84c-6cab84452052 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.269108] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.269108] nova-compute[62208]: warnings.warn( [ 815.279083] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 815.289772] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 815.311497] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.668s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 815.312018] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 815.314415] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.163s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 815.316048] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 815.316048] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 815.316048] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-610e24da-4213-4535-89e5-77579f01f988 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.319117] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.319117] nova-compute[62208]: warnings.warn( [ 815.325235] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c68731b7-a650-4eac-b54a-792a02ab5176 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.330128] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.330128] nova-compute[62208]: warnings.warn( [ 815.343431] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44166613-435c-4f6d-add8-f320cf1397e2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.346264] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.346264] nova-compute[62208]: warnings.warn( [ 815.351806] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66d684ec-ed37-4e5e-8129-efae2f9aa4f6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.355141] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.355141] nova-compute[62208]: warnings.warn( [ 815.359797] nova-compute[62208]: DEBUG nova.compute.utils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 815.389461] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 815.389698] nova-compute[62208]: DEBUG nova.network.neutron [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 815.393141] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181829MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 815.393369] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 815.393612] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 815.395764] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 815.474567] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 815.478010] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75fd2a2c-4ef5-4b42-b309-53cff148c772 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.478111] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.478239] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bd0eef47-56e8-45b6-92b1-e81400994572 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.478361] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9f48db49-1618-4b04-88a6-315c0f9b889a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.478478] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 858b585c-7746-4d38-84c9-b3ee719eb406 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.478615] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.478767] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.478889] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.479005] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.479118] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 815.490479] nova-compute[62208]: DEBUG nova.policy [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60bd69612634010a578bff6d2526051', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7b1b3e37008d4f8aa2e955b8f91b9489', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 815.495389] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 61e911d7-b8e9-416e-b73c-574768744974 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.507012] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f5f7e84c-2d39-4929-be15-e7c03fae4319 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.509438] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 815.509617] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 815.509776] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 815.509981] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 815.510140] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 815.510968] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 815.510968] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 815.510968] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Build topologies for 
1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 815.510968] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 815.510968] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 815.511199] nova-compute[62208]: DEBUG nova.virt.hardware [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 815.512249] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4a7f061-1caf-4258-b914-778d3c7ae533 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.515430] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.515430] nova-compute[62208]: warnings.warn( [ 815.517399] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d4e0170a-0993-4f7f-a7fa-6539bb13a082 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.523008] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97e7f075-02b3-4198-ae7a-239320b63f5a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.526995] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 815.526995] nova-compute[62208]: warnings.warn( [ 815.541664] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 2681fbe1-7ed8-4280-95ac-f98063278b52 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.558487] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3be03d93-aae7-4312-832f-5a61b49753bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.570484] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 78471258-10a4-42e2-8d2a-f30b2baaa5d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.583199] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5d6be180-d89f-44ba-847e-0ea169316d90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.597616] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4474e61b-0664-40f7-a8ec-be3d14684b10 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.621196] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a12f05-6178-44bb-9eb0-b52d806fe91d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.635042] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 65d39cb0-8eed-49e2-a854-032d527cd0e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.647388] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bc074836-1520-44cf-aae0-acbfaa7a77e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.662168] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.673680] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.687070] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e21efb06-821b-4bec-a6d9-f57ae59d038a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.698971] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.713271] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b619a949-11d4-4178-9424-54841ee6c26e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.723927] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ecd5e5ca-bcef-45ed-b7b5-fd572d80bd50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.736504] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 815.736919] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 815.737178] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 815.757833] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing inventories for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 815.773864] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating ProviderTree inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 815.774109] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 815.788094] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing aggregate associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, aggregates: None {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 815.811957] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing trait associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, traits: 
COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 815.905745] nova-compute[62208]: DEBUG nova.network.neutron [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Successfully created port: cb09ac96-41a8-4e20-99f3-3ef6d8c9f872 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 816.270616] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ead6506-4075-4381-85e1-6cb1e8db192e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.273484] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 816.273484] nova-compute[62208]: warnings.warn( [ 816.279930] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d36fdef-1ab8-47e5-96d5-6d48627216be {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.282442] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 816.282442] nova-compute[62208]: warnings.warn( [ 816.312202] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6a83c44-82d2-4940-8fbc-97cd967dd6e7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.313614] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 816.313614] nova-compute[62208]: warnings.warn( [ 816.320146] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb7297fe-9681-47b1-8f13-37db4ad2f389 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.324579] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 816.324579] nova-compute[62208]: warnings.warn( [ 816.336266] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 816.344573] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 816.361278] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 816.361480] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.968s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 816.974578] nova-compute[62208]: DEBUG nova.network.neutron [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Successfully updated port: cb09ac96-41a8-4e20-99f3-3ef6d8c9f872 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 816.986743] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "refresh_cache-ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 816.986885] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquired lock "refresh_cache-ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 816.987015] nova-compute[62208]: DEBUG nova.network.neutron [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 817.094303] nova-compute[62208]: DEBUG nova.network.neutron [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 
tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 817.275975] nova-compute[62208]: DEBUG nova.network.neutron [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Updating instance_info_cache with network_info: [{"id": "cb09ac96-41a8-4e20-99f3-3ef6d8c9f872", "address": "fa:16:3e:84:9f:77", "network": {"id": "0b95e4ca-1dd2-43d1-a389-6ba559994378", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1147355681-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7b1b3e37008d4f8aa2e955b8f91b9489", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69351262-8d39-441a-85ba-3a78df436d17", "external-id": "nsx-vlan-transportzone-205", "segmentation_id": 205, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb09ac96-41", "ovs_interfaceid": "cb09ac96-41a8-4e20-99f3-3ef6d8c9f872", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 817.293633] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Releasing lock "refresh_cache-ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 817.294027] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Instance network_info: |[{"id": "cb09ac96-41a8-4e20-99f3-3ef6d8c9f872", "address": "fa:16:3e:84:9f:77", "network": {"id": "0b95e4ca-1dd2-43d1-a389-6ba559994378", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1147355681-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7b1b3e37008d4f8aa2e955b8f91b9489", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69351262-8d39-441a-85ba-3a78df436d17", "external-id": "nsx-vlan-transportzone-205", "segmentation_id": 205, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb09ac96-41", "ovs_interfaceid": "cb09ac96-41a8-4e20-99f3-3ef6d8c9f872", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 817.294979] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:84:9f:77', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69351262-8d39-441a-85ba-3a78df436d17', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cb09ac96-41a8-4e20-99f3-3ef6d8c9f872', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 817.304985] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Creating folder: Project (7b1b3e37008d4f8aa2e955b8f91b9489). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 817.305843] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1055aa78-8b56-41e0-9031-7a330bb5c308 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.308653] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 817.308653] nova-compute[62208]: warnings.warn( [ 817.321471] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Created folder: Project (7b1b3e37008d4f8aa2e955b8f91b9489) in parent group-v17427. [ 817.321691] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Creating folder: Instances. Parent ref: group-v17489. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 817.321925] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c38a5eb8-a476-4f9f-880c-84a2801462dc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.323591] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 817.323591] nova-compute[62208]: warnings.warn( [ 817.338169] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Created folder: Instances in parent group-v17489. 
[ 817.338169] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 817.338169] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 817.338169] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-15e3115c-494a-4de9-90a7-31cc378bad83 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.359683] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 817.359683] nova-compute[62208]: warnings.warn( [ 817.360620] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 817.361005] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 817.366390] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 817.366390] nova-compute[62208]: value = "task-38473" [ 817.366390] nova-compute[62208]: _type = "Task" [ 817.366390] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 817.370424] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 817.370424] nova-compute[62208]: warnings.warn( [ 817.377922] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38473, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 817.671410] nova-compute[62208]: DEBUG nova.compute.manager [req-594e7f9c-f8ef-4b68-b309-4d4fa937f709 req-f3059b4b-6d31-4366-8afb-3a51aae70785 service nova] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Received event network-vif-plugged-cb09ac96-41a8-4e20-99f3-3ef6d8c9f872 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 817.671630] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-594e7f9c-f8ef-4b68-b309-4d4fa937f709 req-f3059b4b-6d31-4366-8afb-3a51aae70785 service nova] Acquiring lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.671868] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-594e7f9c-f8ef-4b68-b309-4d4fa937f709 req-f3059b4b-6d31-4366-8afb-3a51aae70785 service nova] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 817.672113] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-594e7f9c-f8ef-4b68-b309-4d4fa937f709 req-f3059b4b-6d31-4366-8afb-3a51aae70785 service nova] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 817.672283] nova-compute[62208]: DEBUG nova.compute.manager [req-594e7f9c-f8ef-4b68-b309-4d4fa937f709 req-f3059b4b-6d31-4366-8afb-3a51aae70785 service nova] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] No waiting events found dispatching network-vif-plugged-cb09ac96-41a8-4e20-99f3-3ef6d8c9f872 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 817.672500] nova-compute[62208]: WARNING nova.compute.manager [req-594e7f9c-f8ef-4b68-b309-4d4fa937f709 req-f3059b4b-6d31-4366-8afb-3a51aae70785 service nova] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Received unexpected event network-vif-plugged-cb09ac96-41a8-4e20-99f3-3ef6d8c9f872 for instance with vm_state building and task_state spawning. [ 817.871158] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 817.871158] nova-compute[62208]: warnings.warn( [ 817.877792] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38473, 'name': CreateVM_Task, 'duration_secs': 0.359444} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 817.878021] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 817.878594] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.878820] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 817.881730] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55241c7f-e9e9-4293-b792-f40c4e55b542 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.891978] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 817.891978] nova-compute[62208]: warnings.warn( [ 817.923623] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Reconfiguring VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 817.924083] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6047063f-a110-4538-a019-a8101f42d600 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.934435] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 817.934435] nova-compute[62208]: warnings.warn( [ 817.941330] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Waiting for the task: (returnval){ [ 817.941330] nova-compute[62208]: value = "task-38474" [ 817.941330] nova-compute[62208]: _type = "Task" [ 817.941330] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 817.944368] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 817.944368] nova-compute[62208]: warnings.warn( [ 817.950979] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Task: {'id': task-38474, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 818.445852] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 818.445852] nova-compute[62208]: warnings.warn( [ 818.451935] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Task: {'id': task-38474, 'name': ReconfigVM_Task, 'duration_secs': 0.111798} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 818.452293] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Reconfigured VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 818.452652] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.574s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 818.452727] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 818.452856] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 818.453209] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f 
tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 818.453572] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8aba335c-7a7f-4713-8602-da769c7b6a3b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.455092] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 818.455092] nova-compute[62208]: warnings.warn( [ 818.458500] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Waiting for the task: (returnval){ [ 818.458500] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521e3f7f-0a00-691c-3ba8-92c54bc0eb0f" [ 818.458500] nova-compute[62208]: _type = "Task" [ 818.458500] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 818.462134] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 818.462134] nova-compute[62208]: warnings.warn( [ 818.474845] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521e3f7f-0a00-691c-3ba8-92c54bc0eb0f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 818.963327] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 818.963327] nova-compute[62208]: warnings.warn( [ 818.970464] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 818.970776] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 818.971211] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 819.892844] nova-compute[62208]: DEBUG nova.compute.manager [req-3d673547-29fc-41ce-aa15-429bf661167b req-e214da91-b867-4e66-9d14-c879d860a771 service nova] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Received event network-changed-cb09ac96-41a8-4e20-99f3-3ef6d8c9f872 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 819.893091] nova-compute[62208]: DEBUG nova.compute.manager [req-3d673547-29fc-41ce-aa15-429bf661167b req-e214da91-b867-4e66-9d14-c879d860a771 service nova] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Refreshing instance network info cache due to event network-changed-cb09ac96-41a8-4e20-99f3-3ef6d8c9f872. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 819.893258] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-3d673547-29fc-41ce-aa15-429bf661167b req-e214da91-b867-4e66-9d14-c879d860a771 service nova] Acquiring lock "refresh_cache-ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 819.893399] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-3d673547-29fc-41ce-aa15-429bf661167b req-e214da91-b867-4e66-9d14-c879d860a771 service nova] Acquired lock "refresh_cache-ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 819.893838] nova-compute[62208]: DEBUG nova.network.neutron [req-3d673547-29fc-41ce-aa15-429bf661167b req-e214da91-b867-4e66-9d14-c879d860a771 service nova] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Refreshing network info cache for port cb09ac96-41a8-4e20-99f3-3ef6d8c9f872 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 820.261294] nova-compute[62208]: DEBUG nova.network.neutron [req-3d673547-29fc-41ce-aa15-429bf661167b req-e214da91-b867-4e66-9d14-c879d860a771 service nova] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Updated VIF entry in instance network info cache for port cb09ac96-41a8-4e20-99f3-3ef6d8c9f872. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 820.261294] nova-compute[62208]: DEBUG nova.network.neutron [req-3d673547-29fc-41ce-aa15-429bf661167b req-e214da91-b867-4e66-9d14-c879d860a771 service nova] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Updating instance_info_cache with network_info: [{"id": "cb09ac96-41a8-4e20-99f3-3ef6d8c9f872", "address": "fa:16:3e:84:9f:77", "network": {"id": "0b95e4ca-1dd2-43d1-a389-6ba559994378", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1147355681-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7b1b3e37008d4f8aa2e955b8f91b9489", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69351262-8d39-441a-85ba-3a78df436d17", "external-id": "nsx-vlan-transportzone-205", "segmentation_id": 205, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb09ac96-41", "ovs_interfaceid": "cb09ac96-41a8-4e20-99f3-3ef6d8c9f872", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 820.270636] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-3d673547-29fc-41ce-aa15-429bf661167b req-e214da91-b867-4e66-9d14-c879d860a771 service nova] Releasing lock "refresh_cache-ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 821.769213] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-28272ba8-771e-4e85-8870-a1ad850a7bee tempest-InstanceActionsNegativeTestJSON-262890172 tempest-InstanceActionsNegativeTestJSON-262890172-project-member] Acquiring lock "805622b8-0c67-464e-a666-bd553818e796" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 821.769549] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-28272ba8-771e-4e85-8870-a1ad850a7bee tempest-InstanceActionsNegativeTestJSON-262890172 tempest-InstanceActionsNegativeTestJSON-262890172-project-member] Lock "805622b8-0c67-464e-a666-bd553818e796" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 822.172341] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.361547] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3fe8377c-a8f3-4bdb-9564-cb84ef68bf51 
tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "0b350d21-7644-49d1-a3d6-f2c069de2f0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.361767] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3fe8377c-a8f3-4bdb-9564-cb84ef68bf51 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "0b350d21-7644-49d1-a3d6-f2c069de2f0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.780933] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3a4e1cbd-0872-42c0-b5e9-db36baacbfc8 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] Acquiring lock "c3083789-9915-42dc-9345-22dabdbec850" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.781635] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3a4e1cbd-0872-42c0-b5e9-db36baacbfc8 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] Lock "c3083789-9915-42dc-9345-22dabdbec850" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 839.439765] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b1df3a2e-5adc-4a3b-893e-29e93f36e31f tempest-ServerTagsTestJSON-1796243761 tempest-ServerTagsTestJSON-1796243761-project-member] Acquiring lock "235a9ab6-3be0-4205-bdb9-8b85c93f0846" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 839.440092] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b1df3a2e-5adc-4a3b-893e-29e93f36e31f tempest-ServerTagsTestJSON-1796243761 tempest-ServerTagsTestJSON-1796243761-project-member] Lock "235a9ab6-3be0-4205-bdb9-8b85c93f0846" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 850.033623] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Acquiring lock "9db515f2-7484-4478-86e0-2e715e08646a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 850.033962] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Lock "9db515f2-7484-4478-86e0-2e715e08646a" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 850.074673] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Acquiring lock "21ce7403-a26b-452f-948a-2e32c606ce00" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 850.074854] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Lock "21ce7403-a26b-452f-948a-2e32c606ce00" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 855.848684] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e2c38d5-9f6b-4929-b9be-c4a293024989 tempest-ServersTestManualDisk-1323072622 tempest-ServersTestManualDisk-1323072622-project-member] Acquiring lock "5672475f-45dd-460d-bb7b-f53dfee798b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 855.848684] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e2c38d5-9f6b-4929-b9be-c4a293024989 tempest-ServersTestManualDisk-1323072622 tempest-ServersTestManualDisk-1323072622-project-member] Lock "5672475f-45dd-460d-bb7b-f53dfee798b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 856.688398] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise 
RemoteDisconnected("Remote end closed connection without" [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 856.688398] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 856.688398] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 856.689237] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 856.689499] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Copying Virtual Disk [datastore2] vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/10d9f480-2d0f-4e17-bdcd-022e38abcc48/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 856.689814] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8ff19b15-ab29-4e73-89bf-ee11b7f1032c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 856.692366] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 856.692366] nova-compute[62208]: warnings.warn( [ 856.698454] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Waiting for the task: (returnval){ [ 856.698454] nova-compute[62208]: value = "task-38488" [ 856.698454] nova-compute[62208]: _type = "Task" [ 856.698454] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 856.701841] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 856.701841] nova-compute[62208]: warnings.warn( [ 856.706978] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Task: {'id': task-38488, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 857.202556] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.202556] nova-compute[62208]: warnings.warn( [ 857.208895] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 857.208895] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 857.209426] nova-compute[62208]: Faults: ['InvalidArgument'] [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Traceback (most recent call last): [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] yield resources [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] self.driver.spawn(context, instance, image_meta, [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] self._vmops.spawn(context, instance, image_meta, injected_files, [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
786, in spawn [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] self._fetch_image_if_missing(context, vi) [ 857.209426] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] image_cache(vi, tmp_image_ds_loc) [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] vm_util.copy_virtual_disk( [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] session._wait_for_task(vmdk_copy_task) [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] return self.wait_for_task(task_ref) [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] return evt.wait() [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] result = hub.switch() [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 857.209786] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] return self.greenlet.switch() [ 857.210212] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 857.210212] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] self.f(*self.args, **self.kw) [ 857.210212] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 857.210212] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] raise exceptions.translate_fault(task_info.error) [ 857.210212] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 857.210212] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Faults: ['InvalidArgument'] [ 857.210212] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] [ 857.210212] nova-compute[62208]: INFO nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Terminating instance [ 857.211430] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 857.211631] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 857.212165] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 857.213323] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquired lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 857.213323] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 857.213612] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-69920c29-7799-41da-b53e-9ea17a27027d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.215671] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.215671] nova-compute[62208]: warnings.warn( [ 857.223464] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 857.223663] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 857.224811] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1d0bc7cd-d1c8-428c-8b00-35686731bcd7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.228869] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.228869] nova-compute[62208]: warnings.warn( [ 857.234974] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Waiting for the task: (returnval){ [ 857.234974] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5286d89a-25e3-0b12-c8b4-c57223c887cc" [ 857.234974] nova-compute[62208]: _type = "Task" [ 857.234974] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 857.236727] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.236727] nova-compute[62208]: warnings.warn( [ 857.242808] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5286d89a-25e3-0b12-c8b4-c57223c887cc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 857.243733] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 857.279198] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 857.289902] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Releasing lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 857.290341] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 857.290532] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 857.291734] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5b38a19-7b8d-45e0-819f-c08640c4f1bc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.295124] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.295124] nova-compute[62208]: warnings.warn( [ 857.300673] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 857.300949] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ddcf69df-0e16-41dc-b45c-af00bfe2ed57 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.302519] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.302519] nova-compute[62208]: warnings.warn( [ 857.337255] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 857.337691] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 857.337920] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Deleting the datastore file [datastore2] 75fd2a2c-4ef5-4b42-b309-53cff148c772 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 857.338332] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-14236859-d772-4af7-9322-b147ec728d57 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.341456] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.341456] nova-compute[62208]: warnings.warn( [ 857.349357] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Waiting for the task: (returnval){ [ 857.349357] nova-compute[62208]: value = "task-38490" [ 857.349357] nova-compute[62208]: _type = "Task" [ 857.349357] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 857.354199] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.354199] nova-compute[62208]: warnings.warn( [ 857.365380] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Task: {'id': task-38490, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 857.738150] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.738150] nova-compute[62208]: warnings.warn( [ 857.753477] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 857.753983] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Creating directory with path [datastore2] vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 857.754348] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0417d8d3-83ff-41ba-8327-3e925254fc07 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.756992] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.756992] nova-compute[62208]: warnings.warn( [ 857.788959] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Created directory with path [datastore2] vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 857.789188] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Fetch image to [datastore2] vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 857.789359] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 857.790270] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25aed9d8-4db2-48bc-beb8-d31e2720f021 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.792916] 
nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.792916] nova-compute[62208]: warnings.warn( [ 857.808933] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db2abbc7-7e82-4b93-bf5a-93dab44d9344 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.811893] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.811893] nova-compute[62208]: warnings.warn( [ 857.819636] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7f6a881-bed9-47c7-bf0c-378ce40c7fb9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.823378] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.823378] nova-compute[62208]: warnings.warn( [ 857.853936] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fedd89a3-cc12-43ce-8764-81a44d4185b6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.856269] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.856269] nova-compute[62208]: warnings.warn( [ 857.856624] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.856624] nova-compute[62208]: warnings.warn( [ 857.863911] nova-compute[62208]: DEBUG oslo_vmware.api [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Task: {'id': task-38490, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.146336} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 857.865369] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 857.865646] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 857.865745] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 857.865912] nova-compute[62208]: INFO nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Took 0.58 seconds to destroy the instance on the hypervisor. [ 857.866157] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 857.866642] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 857.866746] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 857.868513] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c0e36559-d424-4b7f-a549-d8aa29449332 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.870794] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 857.870794] nova-compute[62208]: warnings.warn( [ 857.903970] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 857.907416] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 857.921021] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 857.936167] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Took 0.07 seconds to deallocate network for instance. [ 857.939068] nova-compute[62208]: DEBUG nova.compute.claims [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Aborting claim: <nova.compute.claims.Claim object at 0x7fb937495660> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 857.939629] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 857.939965] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 857.989472] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 858.053020] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 858.053250] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 858.423721] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80ef31c6-53c4-4495-a3de-febc4e7dc24b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.426642] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 858.426642] nova-compute[62208]: warnings.warn( [ 858.432773] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1d9bf8f-fecb-46df-8f22-b2755563c417 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.436831] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 858.436831] nova-compute[62208]: warnings.warn( [ 858.469677] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e4baacd-b322-4c51-8ccf-1d360b09a801 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.472907] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 858.472907] nova-compute[62208]: warnings.warn( [ 858.480369] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5336f21f-8dd9-4e8e-89f5-74854db0a280 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.484546] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 858.484546] nova-compute[62208]: warnings.warn( [ 858.496262] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 858.506584] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 858.523861] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.584s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 858.524431] nova-compute[62208]: Faults: ['InvalidArgument'] [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Traceback (most recent call last): [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] self.driver.spawn(context, instance, image_meta, [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] self._vmops.spawn(context, instance, image_meta, injected_files, [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] self._fetch_image_if_missing(context, vi) [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] image_cache(vi, tmp_image_ds_loc) [ 858.524431] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] vm_util.copy_virtual_disk( [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] session._wait_for_task(vmdk_copy_task) [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] return self.wait_for_task(task_ref) [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] return evt.wait() [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] result = hub.switch() [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] return self.greenlet.switch() [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 858.524843] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] self.f(*self.args, **self.kw) [ 858.525248] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 858.525248] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] raise exceptions.translate_fault(task_info.error) [ 858.525248] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 858.525248] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Faults: ['InvalidArgument'] [ 858.525248] nova-compute[62208]: ERROR nova.compute.manager [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] [ 858.525248] nova-compute[62208]: DEBUG nova.compute.utils [None 
req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 858.526762] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Build of instance 75fd2a2c-4ef5-4b42-b309-53cff148c772 was re-scheduled: A specified parameter was not correct: fileType [ 858.526762] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 858.527169] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 858.527396] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 858.527544] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquired lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 858.527770] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 858.574477] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 858.673224] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 858.686251] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Releasing lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 858.686251] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 858.686251] nova-compute[62208]: DEBUG nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 858.686251] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 858.724049] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 858.732603] nova-compute[62208]: DEBUG nova.network.neutron [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 858.741682] nova-compute[62208]: INFO nova.compute.manager [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Took 0.06 seconds to deallocate network for instance. 
[ 858.851509] nova-compute[62208]: INFO nova.scheduler.client.report [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Deleted allocations for instance 75fd2a2c-4ef5-4b42-b309-53cff148c772 [ 858.878492] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-38851c7c-106e-4e57-9b79-1c38b0c4c049 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "75fd2a2c-4ef5-4b42-b309-53cff148c772" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 278.582s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 858.879773] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "75fd2a2c-4ef5-4b42-b309-53cff148c772" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 79.468s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 858.879984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "75fd2a2c-4ef5-4b42-b309-53cff148c772-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 858.880211] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "75fd2a2c-4ef5-4b42-b309-53cff148c772-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 858.880371] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "75fd2a2c-4ef5-4b42-b309-53cff148c772-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 858.882995] nova-compute[62208]: INFO nova.compute.manager [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Terminating instance [ 858.884618] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquiring lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 858.884699] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Acquired lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 858.884851] nova-compute[62208]: DEBUG nova.network.neutron [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 858.901212] nova-compute[62208]: DEBUG nova.compute.manager [None req-bc94d813-3d3c-40ce-8102-b6786b8ce74d tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] [instance: 61e911d7-b8e9-416e-b73c-574768744974] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 858.934947] nova-compute[62208]: DEBUG nova.compute.manager [None req-bc94d813-3d3c-40ce-8102-b6786b8ce74d tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] [instance: 61e911d7-b8e9-416e-b73c-574768744974] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 858.969446] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-bc94d813-3d3c-40ce-8102-b6786b8ce74d tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] Lock "61e911d7-b8e9-416e-b73c-574768744974" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 226.698s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 858.982740] nova-compute[62208]: DEBUG nova.compute.manager [None req-4487296c-1b8c-4d90-abc0-e7ad236d5263 tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] [instance: f5f7e84c-2d39-4929-be15-e7c03fae4319] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 858.988791] nova-compute[62208]: DEBUG nova.network.neutron [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 859.018038] nova-compute[62208]: DEBUG nova.compute.manager [None req-4487296c-1b8c-4d90-abc0-e7ad236d5263 tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] [instance: f5f7e84c-2d39-4929-be15-e7c03fae4319] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.050526] nova-compute[62208]: DEBUG nova.network.neutron [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 859.054783] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4487296c-1b8c-4d90-abc0-e7ad236d5263 tempest-ListImageFiltersTestJSON-818600066 tempest-ListImageFiltersTestJSON-818600066-project-member] Lock "f5f7e84c-2d39-4929-be15-e7c03fae4319" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 225.430s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.072684] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Releasing lock "refresh_cache-75fd2a2c-4ef5-4b42-b309-53cff148c772" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 859.073119] nova-compute[62208]: DEBUG nova.compute.manager [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 859.073311] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 859.073666] nova-compute[62208]: DEBUG nova.compute.manager [None req-c0227e88-29e7-4841-981b-4a5deb2ddf23 tempest-ServerActionsTestOtherA-1944985306 tempest-ServerActionsTestOtherA-1944985306-project-member] [instance: d4e0170a-0993-4f7f-a7fa-6539bb13a082] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.076320] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a67164b0-8ac7-47fe-b1e3-339cef94f80b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.078579] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 859.078579] nova-compute[62208]: warnings.warn( [ 859.086950] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dd441c7-e065-4739-b22b-f8d0e937bff5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.099588] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 859.099588] nova-compute[62208]: warnings.warn( [ 859.118805] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 75fd2a2c-4ef5-4b42-b309-53cff148c772 could not be found. [ 859.119110] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 859.119446] nova-compute[62208]: INFO nova.compute.manager [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Took 0.05 seconds to destroy the instance on the hypervisor. [ 859.119658] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 859.120542] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 859.120701] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 859.122846] nova-compute[62208]: DEBUG nova.compute.manager [None req-c0227e88-29e7-4841-981b-4a5deb2ddf23 tempest-ServerActionsTestOtherA-1944985306 tempest-ServerActionsTestOtherA-1944985306-project-member] [instance: d4e0170a-0993-4f7f-a7fa-6539bb13a082] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.153259] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0227e88-29e7-4841-981b-4a5deb2ddf23 tempest-ServerActionsTestOtherA-1944985306 tempest-ServerActionsTestOtherA-1944985306-project-member] Lock "d4e0170a-0993-4f7f-a7fa-6539bb13a082" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.501s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.165514] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 859.168130] nova-compute[62208]: DEBUG nova.compute.manager [None req-ad115aa5-fb16-42af-b099-d0e79a132f92 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] [instance: 2681fbe1-7ed8-4280-95ac-f98063278b52] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.173538] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 859.182454] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 75fd2a2c-4ef5-4b42-b309-53cff148c772] Took 0.06 seconds to deallocate network for instance. [ 859.215026] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-89623a71-b86c-40a6-ba07-065a79f32e60 tempest-ServerActionsV293TestJSON-2076507366 tempest-ServerActionsV293TestJSON-2076507366-project-member] Acquiring lock "61cc043c-1d4a-4a47-86b0-cc4fb61abed4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 859.215470] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-89623a71-b86c-40a6-ba07-065a79f32e60 tempest-ServerActionsV293TestJSON-2076507366 tempest-ServerActionsV293TestJSON-2076507366-project-member] Lock "61cc043c-1d4a-4a47-86b0-cc4fb61abed4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 859.215738] nova-compute[62208]: DEBUG nova.compute.manager [None req-ad115aa5-fb16-42af-b099-d0e79a132f92 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] [instance: 2681fbe1-7ed8-4280-95ac-f98063278b52] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.263083] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ad115aa5-fb16-42af-b099-d0e79a132f92 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] Lock "2681fbe1-7ed8-4280-95ac-f98063278b52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 222.103s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.279424] nova-compute[62208]: DEBUG nova.compute.manager [None req-7ff5b5ec-d8a8-482a-9151-921c71086c10 tempest-AttachInterfacesUnderV243Test-662403438 tempest-AttachInterfacesUnderV243Test-662403438-project-member] [instance: 3be03d93-aae7-4312-832f-5a61b49753bb] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.319474] nova-compute[62208]: DEBUG nova.compute.manager [None req-7ff5b5ec-d8a8-482a-9151-921c71086c10 tempest-AttachInterfacesUnderV243Test-662403438 tempest-AttachInterfacesUnderV243Test-662403438-project-member] [instance: 3be03d93-aae7-4312-832f-5a61b49753bb] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.327229] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f2e409ff-f27b-4ebb-9590-91e1418cbe65 tempest-ServersAdminNegativeTestJSON-1882625037 tempest-ServersAdminNegativeTestJSON-1882625037-project-member] Lock "75fd2a2c-4ef5-4b42-b309-53cff148c772" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.447s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.380895] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7ff5b5ec-d8a8-482a-9151-921c71086c10 tempest-AttachInterfacesUnderV243Test-662403438 tempest-AttachInterfacesUnderV243Test-662403438-project-member] Lock "3be03d93-aae7-4312-832f-5a61b49753bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 216.069s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.393607] nova-compute[62208]: DEBUG nova.compute.manager [None req-91f2127d-27e2-401b-be3d-e4848189e2e1 tempest-AttachInterfacesV270Test-1164794761 tempest-AttachInterfacesV270Test-1164794761-project-member] [instance: 78471258-10a4-42e2-8d2a-f30b2baaa5d9] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.426777] nova-compute[62208]: DEBUG nova.compute.manager [None req-91f2127d-27e2-401b-be3d-e4848189e2e1 tempest-AttachInterfacesV270Test-1164794761 tempest-AttachInterfacesV270Test-1164794761-project-member] [instance: 78471258-10a4-42e2-8d2a-f30b2baaa5d9] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.453239] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-91f2127d-27e2-401b-be3d-e4848189e2e1 tempest-AttachInterfacesV270Test-1164794761 tempest-AttachInterfacesV270Test-1164794761-project-member] Lock "78471258-10a4-42e2-8d2a-f30b2baaa5d9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.473s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.465255] nova-compute[62208]: DEBUG nova.compute.manager [None req-abb41f28-a102-4838-9aef-3889631c6135 tempest-AttachSCSIVolumeTestJSON-1879792403 tempest-AttachSCSIVolumeTestJSON-1879792403-project-member] [instance: 5d6be180-d89f-44ba-847e-0ea169316d90] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.494764] nova-compute[62208]: DEBUG nova.compute.manager [None req-abb41f28-a102-4838-9aef-3889631c6135 tempest-AttachSCSIVolumeTestJSON-1879792403 tempest-AttachSCSIVolumeTestJSON-1879792403-project-member] [instance: 5d6be180-d89f-44ba-847e-0ea169316d90] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.530777] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-abb41f28-a102-4838-9aef-3889631c6135 tempest-AttachSCSIVolumeTestJSON-1879792403 tempest-AttachSCSIVolumeTestJSON-1879792403-project-member] Lock "5d6be180-d89f-44ba-847e-0ea169316d90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.517s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.551040] nova-compute[62208]: DEBUG nova.compute.manager [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] [instance: 4474e61b-0664-40f7-a8ec-be3d14684b10] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.595454] nova-compute[62208]: DEBUG nova.compute.manager [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] [instance: 4474e61b-0664-40f7-a8ec-be3d14684b10] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.622718] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Lock "4474e61b-0664-40f7-a8ec-be3d14684b10" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.520s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.637890] nova-compute[62208]: DEBUG nova.compute.manager [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] [instance: d1a12f05-6178-44bb-9eb0-b52d806fe91d] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.679997] nova-compute[62208]: DEBUG nova.compute.manager [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] [instance: d1a12f05-6178-44bb-9eb0-b52d806fe91d] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.710292] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7fdd710c-95ab-4a98-8cfb-f11c3ef229ad tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Lock "d1a12f05-6178-44bb-9eb0-b52d806fe91d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.557s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.729442] nova-compute[62208]: DEBUG nova.compute.manager [None req-271bbde6-e4a4-4e8c-becb-2d1755d473a8 tempest-ServerAddressesNegativeTestJSON-1677873053 tempest-ServerAddressesNegativeTestJSON-1677873053-project-member] [instance: 65d39cb0-8eed-49e2-a854-032d527cd0e8] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.768308] nova-compute[62208]: DEBUG nova.compute.manager [None req-271bbde6-e4a4-4e8c-becb-2d1755d473a8 tempest-ServerAddressesNegativeTestJSON-1677873053 tempest-ServerAddressesNegativeTestJSON-1677873053-project-member] [instance: 65d39cb0-8eed-49e2-a854-032d527cd0e8] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 859.796402] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-271bbde6-e4a4-4e8c-becb-2d1755d473a8 tempest-ServerAddressesNegativeTestJSON-1677873053 tempest-ServerAddressesNegativeTestJSON-1677873053-project-member] Lock "65d39cb0-8eed-49e2-a854-032d527cd0e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 201.750s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.812993] nova-compute[62208]: DEBUG nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 859.871016] nova-compute[62208]: ERROR nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Attempt to boot guest with tagged devices on host that does not support tagging.: nova.exception.BuildAbortException: Attempt to boot guest with tagged devices on host that does not support tagging. 
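The build of instance bc074836-1520-44cf-aae0-acbfaa7a77e9 aborts with BuildAbortException because the test requests tagged boot devices and this host does not support device tagging; the entries that follow walk the cleanup path (deallocate the network, best-effort volume detach, delete the placement allocations). A hedged sketch of that abort-and-clean-up control flow, with placeholder helper names that are not Nova's real methods, looks like:

    # Hypothetical sketch of the abort path traced by the following log
    # entries; every helper passed in is a placeholder, not Nova code.
    class BuildAbortException(Exception):
        pass

    def build_instance(instance, host_supports_tagging,
                       deallocate_network, cleanup_volumes, delete_allocations):
        try:
            if instance.get("tagged_devices") and not host_supports_tagging:
                raise BuildAbortException(
                    "Attempt to boot guest with tagged devices on host that "
                    "does not support tagging.")
            # ... the normal spawn path would continue here ...
        except BuildAbortException:
            # Mirror the log: release network resources, detach volumes on a
            # best-effort basis, then drop the placement allocations before
            # giving up on the build.
            deallocate_network(instance)
            cleanup_volumes(instance)
            delete_allocations(instance)
            raise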
[ 859.871201] nova-compute[62208]: DEBUG nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 859.871418] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Acquiring lock "refresh_cache-bc074836-1520-44cf-aae0-acbfaa7a77e9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 859.871557] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Acquired lock "refresh_cache-bc074836-1520-44cf-aae0-acbfaa7a77e9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 859.871718] nova-compute[62208]: DEBUG nova.network.neutron [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 859.931807] nova-compute[62208]: DEBUG nova.network.neutron [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 860.156533] nova-compute[62208]: DEBUG nova.network.neutron [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 860.167701] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Releasing lock "refresh_cache-bc074836-1520-44cf-aae0-acbfaa7a77e9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 860.167983] nova-compute[62208]: DEBUG nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 860.168198] nova-compute[62208]: DEBUG nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 860.168357] nova-compute[62208]: DEBUG nova.network.neutron [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 860.419576] nova-compute[62208]: DEBUG nova.network.neutron [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 861.577642] nova-compute[62208]: DEBUG nova.network.neutron [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 861.589131] nova-compute[62208]: INFO nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Took 1.42 seconds to deallocate network for instance. [ 861.642864] nova-compute[62208]: DEBUG nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Detaching volume: 86a45fb8-b4e6-4a89-b052-b43bdf7b7a63 {{(pid=62208) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3225}} [ 861.643094] nova-compute[62208]: INFO nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Detaching volume 86a45fb8-b4e6-4a89-b052-b43bdf7b7a63 [ 861.701657] nova-compute[62208]: DEBUG nova.virt.block_device [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Skipping driver_detach during remote rebuild. 
{{(pid=62208) _do_detach /opt/stack/nova/nova/virt/block_device.py:461}} [ 861.761655] nova-compute[62208]: WARNING nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Failed to detach volume: 86a45fb8-b4e6-4a89-b052-b43bdf7b7a63 due to 'NoneType' object has no attribute 'devices' [ 861.761902] nova-compute[62208]: DEBUG nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Detaching volume: bfe750fb-8fa7-475c-9dde-426ea3151445 {{(pid=62208) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3225}} [ 861.762094] nova-compute[62208]: INFO nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Detaching volume bfe750fb-8fa7-475c-9dde-426ea3151445 [ 861.802070] nova-compute[62208]: DEBUG nova.virt.block_device [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Skipping driver_detach during remote rebuild. {{(pid=62208) _do_detach /opt/stack/nova/nova/virt/block_device.py:461}} [ 861.878365] nova-compute[62208]: WARNING nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Failed to detach volume: bfe750fb-8fa7-475c-9dde-426ea3151445 due to 'NoneType' object has no attribute 'devices' [ 861.878585] nova-compute[62208]: DEBUG nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Detaching volume: 87ce587b-1fd4-4549-9135-e6d0b9e7aff3 {{(pid=62208) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3225}} [ 861.878800] nova-compute[62208]: INFO nova.compute.manager [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Detaching volume 87ce587b-1fd4-4549-9135-e6d0b9e7aff3 [ 861.916373] nova-compute[62208]: DEBUG nova.virt.block_device [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] [instance: bc074836-1520-44cf-aae0-acbfaa7a77e9] Skipping driver_detach during remote rebuild. 
{{(pid=62208) _do_detach /opt/stack/nova/nova/virt/block_device.py:461}} [ 862.069230] nova-compute[62208]: INFO nova.scheduler.client.report [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Deleted allocations for instance bc074836-1520-44cf-aae0-acbfaa7a77e9 [ 862.069230] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-de6c3cf5-4aab-423b-897c-210175773ff3 tempest-TaggedBootDevicesTest_v242-1001415429 tempest-TaggedBootDevicesTest_v242-1001415429-project-member] Lock "bc074836-1520-44cf-aae0-acbfaa7a77e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 195.311s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 862.079373] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 862.138429] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 862.138693] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 862.140247] nova-compute[62208]: INFO nova.compute.claims [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 862.579447] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fd68bd2-0359-466e-8148-03aa683819ca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.582235] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.582235] nova-compute[62208]: warnings.warn( [ 862.587464] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6859d07-04dc-4fc3-9867-e7c5d62e0dcf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.590778] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.590778] nova-compute[62208]: warnings.warn( [ 862.617665] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0a85ddb-2485-4332-857d-07fd93244c53 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.620176] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.620176] nova-compute[62208]: warnings.warn( [ 862.626654] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bded31fe-d542-4e0d-8d3d-f48b6d212f7d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.630481] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.630481] nova-compute[62208]: warnings.warn( [ 862.641211] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 862.650461] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 862.673719] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.529s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 862.673719] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 862.713728] nova-compute[62208]: DEBUG nova.compute.utils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 862.716179] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 862.716395] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 862.733295] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 862.780033] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] No network configured {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1188}} [ 862.780033] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance network_info: |[]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 862.815087] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 862.837324] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 862.837623] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 862.837780] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 862.837967] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 862.838108] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 862.838256] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 862.838475] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 862.840481] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 862.840481] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 862.840481] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 862.840481] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 862.840481] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-748c3822-aafa-44f1-9278-4b212897adad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.842722] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.842722] nova-compute[62208]: warnings.warn( [ 862.848519] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cf33668-16ad-4f55-b16d-8da35360f132 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.853406] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.853406] nova-compute[62208]: warnings.warn( [ 862.867873] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 862.871061] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Creating folder: Project (6119b7529ef940eeaa910ac394663f47). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 862.871458] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1231d6c1-8e7a-4c63-8e10-f112fd673a7b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.873452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.873452] nova-compute[62208]: warnings.warn( [ 862.883518] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Created folder: Project (6119b7529ef940eeaa910ac394663f47) in parent group-v17427. [ 862.883752] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Creating folder: Instances. Parent ref: group-v17499. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 862.884246] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-532886e2-1f2b-4fad-b4fe-1269f6cece7a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.886215] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.886215] nova-compute[62208]: warnings.warn( [ 862.896037] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Created folder: Instances in parent group-v17499. [ 862.896290] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 862.896487] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 862.896690] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ebf706a6-ec28-4c86-8ec1-82562b486dba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.909180] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.909180] nova-compute[62208]: warnings.warn( [ 862.914735] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 862.914735] nova-compute[62208]: value = "task-38496" [ 862.914735] nova-compute[62208]: _type = "Task" [ 862.914735] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 862.918248] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 862.918248] nova-compute[62208]: warnings.warn( [ 862.923548] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38496, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 862.966262] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquiring lock "6e798882-aa11-4c1b-891c-7428d2bba113" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 862.966513] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "6e798882-aa11-4c1b-891c-7428d2bba113" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 863.419077] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 863.419077] nova-compute[62208]: warnings.warn( [ 863.427323] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38496, 'name': CreateVM_Task, 'duration_secs': 0.253215} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 863.427504] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 863.427848] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 863.428112] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 863.430887] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9cfba77-abe8-4fc9-b6d7-8a30a7f20326 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.440962] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 863.440962] nova-compute[62208]: warnings.warn( [ 863.463062] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Reconfiguring VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 863.463504] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-2412c3e9-67cd-49bc-bd25-485a05053cbe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.474289] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 863.474289] nova-compute[62208]: warnings.warn( [ 863.480201] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Waiting for the task: (returnval){ [ 863.480201] nova-compute[62208]: value = "task-38497" [ 863.480201] nova-compute[62208]: _type = "Task" [ 863.480201] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 863.485043] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 863.485043] nova-compute[62208]: warnings.warn( [ 863.490680] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Task: {'id': task-38497, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 863.984290] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 863.984290] nova-compute[62208]: warnings.warn( [ 863.990355] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Task: {'id': task-38497, 'name': ReconfigVM_Task, 'duration_secs': 0.111738} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 863.990528] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Reconfigured VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 863.990771] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.563s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 863.990963] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 863.991102] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 863.991445] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 
tempest-ServerExternalEventsTest-960011189-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 863.991705] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3572ef1f-ad43-4300-b540-dded2b715641 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.993403] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 863.993403] nova-compute[62208]: warnings.warn( [ 863.996963] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Waiting for the task: (returnval){ [ 863.996963] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f6f47b-6f64-25c1-2697-6055a4b6f5db" [ 863.996963] nova-compute[62208]: _type = "Task" [ 863.996963] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 864.002080] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 864.002080] nova-compute[62208]: warnings.warn( [ 864.007007] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f6f47b-6f64-25c1-2697-6055a4b6f5db, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 864.502874] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 864.502874] nova-compute[62208]: warnings.warn( [ 864.510669] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 864.511113] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 864.512219] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 866.059141] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 868.574869] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "d5c7531e-b496-4aed-be05-f1a96391e327" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 868.575224] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "d5c7531e-b496-4aed-be05-f1a96391e327" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 874.141098] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 875.136307] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 875.159047] nova-compute[62208]: DEBUG oslo_service.periodic_task [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 875.159661] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 875.159916] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 875.189690] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.190128] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.190403] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.190652] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.190919] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.191178] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.191423] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.191652] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.191902] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.192258] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 875.193466] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 875.194109] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 875.194537] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 875.194805] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 876.141761] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 877.141274] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 877.141453] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 877.141487] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 877.153801] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 877.154043] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 877.154192] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.154344] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 877.155715] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a22df86e-9a86-4cdf-a129-c21d4725c51f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.158422] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 877.158422] nova-compute[62208]: warnings.warn( [ 877.164812] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92fcc7b0-239e-45e5-ad6b-74036c69ea68 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.168660] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 877.168660] nova-compute[62208]: warnings.warn( [ 877.179658] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d40b6c29-1bcb-4f11-ab2c-2d82efa7939d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.184070] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 877.184070] nova-compute[62208]: warnings.warn( [ 877.189278] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fff825d-c143-4915-8aec-1f587892d0f3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.193029] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 877.193029] nova-compute[62208]: warnings.warn( [ 877.227549] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181928MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 877.228177] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 877.228507] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 877.305477] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.305653] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bd0eef47-56e8-45b6-92b1-e81400994572 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.305784] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9f48db49-1618-4b04-88a6-315c0f9b889a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.305908] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 858b585c-7746-4d38-84c9-b3ee719eb406 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.306027] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.306146] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.306262] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.306376] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.306492] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.306610] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 877.320826] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.332277] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e21efb06-821b-4bec-a6d9-f57ae59d038a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.345180] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.358151] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b619a949-11d4-4178-9424-54841ee6c26e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.371336] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ecd5e5ca-bcef-45ed-b7b5-fd572d80bd50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.384314] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.396178] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 805622b8-0c67-464e-a666-bd553818e796 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.408815] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 0b350d21-7644-49d1-a3d6-f2c069de2f0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.423058] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c3083789-9915-42dc-9345-22dabdbec850 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.436397] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 235a9ab6-3be0-4205-bdb9-8b85c93f0846 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.447504] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9db515f2-7484-4478-86e0-2e715e08646a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.466405] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 21ce7403-a26b-452f-948a-2e32c606ce00 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.483698] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5672475f-45dd-460d-bb7b-f53dfee798b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.502155] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 61cc043c-1d4a-4a47-86b0-cc4fb61abed4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.515823] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 6e798882-aa11-4c1b-891c-7428d2bba113 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.528477] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 877.528784] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 877.528974] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 877.801800] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-88ff02c9-19db-4935-b3d7-6ef6f37cad7e tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Acquiring lock "b03b2152-c17a-4757-b1ad-8e2fd1430fb3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 877.802101] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-88ff02c9-19db-4935-b3d7-6ef6f37cad7e tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Lock "b03b2152-c17a-4757-b1ad-8e2fd1430fb3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 877.967595] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d77f903e-9c6b-4289-a8eb-a1d5cb568533 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.967595] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 877.967595] nova-compute[62208]: warnings.warn( [ 877.972667] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-846a5e2a-2d8d-4d1c-b90e-8b1069a1de7f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.976222] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 877.976222] nova-compute[62208]: warnings.warn( [ 878.003834] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a33677b-3a15-4df3-84a8-5d31e9dee867 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.006882] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 878.006882] nova-compute[62208]: warnings.warn( [ 878.012644] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1c6ba0a-02f8-4ed5-9a58-7c0116473f6a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.016677] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 878.016677] nova-compute[62208]: warnings.warn( [ 878.027001] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 878.037721] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 878.062542] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 878.062542] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.833s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.460430] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-afe46e49-843d-4946-a616-7292b5b12bdf tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Acquiring lock "ab417ba8-5304-4dfd-a08e-102a43996d9f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 878.460732] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-afe46e49-843d-4946-a616-7292b5b12bdf tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Lock "ab417ba8-5304-4dfd-a08e-102a43996d9f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 879.105465] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-523bc6ca-b005-4f1d-b6f4-68ce99a4391a tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Acquiring lock "a2d62b77-29a3-4813-b4db-782e1aa52834" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 879.105947] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-523bc6ca-b005-4f1d-b6f4-68ce99a4391a tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Lock "a2d62b77-29a3-4813-b4db-782e1aa52834" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.056088] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 880.922834] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-587d14de-0f25-47c3-ab7d-1b870a46dad9 tempest-TaggedBootDevicesTest-1956918411 tempest-TaggedBootDevicesTest-1956918411-project-member] Acquiring lock "3cd22f76-71d2-4d03-88c6-10192bb9418e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.923269] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-587d14de-0f25-47c3-ab7d-1b870a46dad9 tempest-TaggedBootDevicesTest-1956918411 tempest-TaggedBootDevicesTest-1956918411-project-member] Lock "3cd22f76-71d2-4d03-88c6-10192bb9418e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 905.313061] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 905.313061] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles self._conn.getresponse() [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 905.313061] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 905.313729] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 905.315167] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 905.315435] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Copying Virtual Disk [datastore2] vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/020070ad-b8df-4440-9dad-42e592a196d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 905.315757] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-607b8a53-2b7b-4533-9b3c-cb28f633d759 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.318121] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.318121] nova-compute[62208]: warnings.warn( [ 905.324946] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Waiting for the task: (returnval){ [ 905.324946] nova-compute[62208]: value = "task-38503" [ 905.324946] nova-compute[62208]: _type = "Task" [ 905.324946] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 905.328285] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.328285] nova-compute[62208]: warnings.warn( [ 905.333515] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Task: {'id': task-38503, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 905.829435] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.829435] nova-compute[62208]: warnings.warn( [ 905.835632] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 905.835952] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 905.836492] nova-compute[62208]: Faults: ['InvalidArgument'] [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Traceback (most recent call last): [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] yield resources [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] self.driver.spawn(context, instance, image_meta, [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] self._fetch_image_if_missing(context, vi) [ 905.836492] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] image_cache(vi, tmp_image_ds_loc) [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] vm_util.copy_virtual_disk( [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 
4ca19153-519c-49e3-bdfd-1f5ea77b24a0] session._wait_for_task(vmdk_copy_task) [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] return self.wait_for_task(task_ref) [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] return evt.wait() [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] result = hub.switch() [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 905.836758] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] return self.greenlet.switch() [ 905.837082] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 905.837082] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] self.f(*self.args, **self.kw) [ 905.837082] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 905.837082] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] raise exceptions.translate_fault(task_info.error) [ 905.837082] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 905.837082] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Faults: ['InvalidArgument'] [ 905.837082] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] [ 905.837082] nova-compute[62208]: INFO nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Terminating instance [ 905.838389] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 905.838606] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 
tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 905.838869] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-40dd6561-23f1-479c-b23c-de9fb9c58443 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.841223] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 905.841413] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 905.842209] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e707c830-7851-43f3-a4e3-2563cc5dcd2d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.844603] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.844603] nova-compute[62208]: warnings.warn( [ 905.844965] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.844965] nova-compute[62208]: warnings.warn( [ 905.849303] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 905.849541] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b4b12fb7-593b-42d0-a560-70b023cb438a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.851902] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 905.852088] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 905.852659] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.852659] nova-compute[62208]: warnings.warn( [ 905.853042] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-15037cfc-6a45-4fdc-9f09-5f9d3b8527f5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.854919] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.854919] nova-compute[62208]: warnings.warn( [ 905.857760] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Waiting for the task: (returnval){ [ 905.857760] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]529dbb34-8cb7-7b13-ba64-9ae4c421f653" [ 905.857760] nova-compute[62208]: _type = "Task" [ 905.857760] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 905.860968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.860968] nova-compute[62208]: warnings.warn( [ 905.866838] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]529dbb34-8cb7-7b13-ba64-9ae4c421f653, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 905.918828] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 905.919086] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 905.919271] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Deleting the datastore file [datastore2] 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 905.919541] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f071d457-0242-4344-898f-6270bfc7a53c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.921428] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.921428] nova-compute[62208]: warnings.warn( [ 905.930293] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Waiting for the task: (returnval){ [ 905.930293] nova-compute[62208]: value = "task-38505" [ 905.930293] nova-compute[62208]: _type = "Task" [ 905.930293] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 905.934127] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 905.934127] nova-compute[62208]: warnings.warn( [ 905.939838] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Task: {'id': task-38505, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 906.361790] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.361790] nova-compute[62208]: warnings.warn( [ 906.367832] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 906.368077] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Creating directory with path [datastore2] vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 906.368322] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c95c1127-3bf2-45c8-bbe8-1e5a47dfd6a2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.370247] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.370247] nova-compute[62208]: warnings.warn( [ 906.380147] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Created directory with path [datastore2] vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 906.381841] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Fetch image to [datastore2] vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 906.381841] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 906.381841] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-809572af-5197-42c9-914d-43ac774e3bbf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.384819] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.384819] nova-compute[62208]: warnings.warn( [ 906.390790] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-157642bd-186c-4524-9820-efe58e96422e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.393725] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.393725] nova-compute[62208]: warnings.warn( [ 906.401912] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4361f9de-e9f6-45af-a8cf-1e1026031767 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.405809] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.405809] nova-compute[62208]: warnings.warn( [ 906.437826] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b30a833e-15cc-434b-9ec9-290b5d9f4f65 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.440266] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.440266] nova-compute[62208]: warnings.warn( [ 906.440699] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.440699] nova-compute[62208]: warnings.warn( [ 906.447334] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Task: {'id': task-38505, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086389} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 906.447670] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-04af5464-d9c9-446c-9bec-c1985fa455a1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.449472] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 906.449721] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 906.449953] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 906.450219] nova-compute[62208]: INFO nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 906.451740] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.451740] nova-compute[62208]: warnings.warn( [ 906.452541] nova-compute[62208]: DEBUG nova.compute.claims [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9375c4550> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 906.452730] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 906.453037] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 906.471410] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 906.593575] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 906.651331] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 906.651531] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 906.939843] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b1d46bb-6233-43e8-bf97-c55f5ce1be91 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.942258] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.942258] nova-compute[62208]: warnings.warn( [ 906.947325] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0fc2a1e-9ffc-4f19-b944-a67298e39a60 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.951057] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.951057] nova-compute[62208]: warnings.warn( [ 906.977275] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-244c0b81-8e1c-4c16-adc2-c2ff3b2edc60 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.980064] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 906.980064] nova-compute[62208]: warnings.warn( [ 906.985337] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1916f1a-0f89-46cc-94b5-3c17b7b5639f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.990133] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 906.990133] nova-compute[62208]: warnings.warn(
[ 907.000275] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 907.009357] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 907.029910] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.577s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 907.030468] nova-compute[62208]: Faults: ['InvalidArgument']
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Traceback (most recent call last):
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     self.driver.spawn(context, instance, image_meta,
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     self._fetch_image_if_missing(context, vi)
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     image_cache(vi, tmp_image_ds_loc)
[ 907.030468] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     vm_util.copy_virtual_disk(
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     session._wait_for_task(vmdk_copy_task)
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     return self.wait_for_task(task_ref)
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     return evt.wait()
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     result = hub.switch()
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     return self.greenlet.switch()
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 907.030752] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     self.f(*self.args, **self.kw)
[ 907.031052] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 907.031052] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]     raise exceptions.translate_fault(task_info.error)
[ 907.031052] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 907.031052] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Faults: ['InvalidArgument']
[ 907.031052] nova-compute[62208]: ERROR nova.compute.manager [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0]
[ 907.031210]
nova-compute[62208]: DEBUG nova.compute.utils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 907.032782] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Build of instance 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 was re-scheduled: A specified parameter was not correct: fileType [ 907.032782] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 907.033191] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 907.033362] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 907.033539] nova-compute[62208]: DEBUG nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 907.033704] nova-compute[62208]: DEBUG nova.network.neutron [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 907.393291] nova-compute[62208]: DEBUG nova.network.neutron [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.406766] nova-compute[62208]: INFO nova.compute.manager [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Took 0.37 seconds to deallocate network for instance. 
[ 907.514385] nova-compute[62208]: INFO nova.scheduler.client.report [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Deleted allocations for instance 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 [ 907.538515] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22dca4e6-6372-4c46-acaf-fc503ad7f5b0 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 325.464s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 907.539815] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 127.396s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 907.540098] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Acquiring lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 907.540343] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 907.540549] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 907.542768] nova-compute[62208]: INFO nova.compute.manager [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Terminating instance [ 907.546083] nova-compute[62208]: DEBUG nova.compute.manager [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 907.546314] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 907.546599] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a09231f5-53fc-479f-a9aa-0d0fc50b911a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 907.548673] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 907.548673] nova-compute[62208]: warnings.warn( [ 907.556368] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8864e349-da30-4811-befd-71cab87d8b94 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 907.567333] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 907.569723] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 907.569723] nova-compute[62208]: warnings.warn( [ 907.589863] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4ca19153-519c-49e3-bdfd-1f5ea77b24a0 could not be found. [ 907.590083] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 907.590262] nova-compute[62208]: INFO nova.compute.manager [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 907.590641] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 907.590815] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 907.590913] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 907.624654] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.629977] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 907.630231] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 907.631750] nova-compute[62208]: INFO nova.compute.claims [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 907.634889] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 4ca19153-519c-49e3-bdfd-1f5ea77b24a0] Took 0.04 seconds to deallocate network for instance. 
[ 907.740816] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c873058c-3d57-4fff-ad61-176916589d43 tempest-FloatingIPsAssociationNegativeTestJSON-743228547 tempest-FloatingIPsAssociationNegativeTestJSON-743228547-project-member] Lock "4ca19153-519c-49e3-bdfd-1f5ea77b24a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.201s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 908.088874] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cab1d0c8-f5ff-4af7-a2a9-80468c398e6a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.091688] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 908.091688] nova-compute[62208]: warnings.warn( [ 908.097474] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb126a7a-07f2-4a5c-9d93-7a51a0441e37 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.100526] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 908.100526] nova-compute[62208]: warnings.warn( [ 908.127595] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1963060-fa42-4012-8548-3f4609f65e7b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.131245] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 908.131245] nova-compute[62208]: warnings.warn( [ 908.137387] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ae226f6-4c0a-4bc2-bba3-6a0bcc5fd86b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.141177] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 908.141177] nova-compute[62208]: warnings.warn( [ 908.151125] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 908.160478] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 908.176439] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.546s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 908.177108] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 908.226377] nova-compute[62208]: DEBUG nova.compute.utils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 908.228012] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 908.228210] nova-compute[62208]: DEBUG nova.network.neutron [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 908.238896] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 908.292671] nova-compute[62208]: DEBUG nova.policy [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9ff061125e3b4c9eb1775be8a58f29b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef379fcf2b9d4456bcbb35bc5f51afb3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 908.310756] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 908.336688] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 908.336955] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 908.337112] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 908.337293] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 908.337439] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 908.337640] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 908.337875] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 908.338041] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 908.338209] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 908.338369] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 908.338562] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 908.339705] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8f5b700-1347-4b59-bc7a-e71dd0d656e8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.342228] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 908.342228] nova-compute[62208]: warnings.warn( [ 908.349706] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29788986-86e8-48e2-b8be-abc47e12c334 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.354907] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 908.354907] nova-compute[62208]: warnings.warn( [ 908.864268] nova-compute[62208]: DEBUG nova.network.neutron [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Successfully created port: 158237e6-1cae-4463-a849-c7636a7e9502 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 909.651964] nova-compute[62208]: DEBUG nova.network.neutron [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Successfully updated port: 158237e6-1cae-4463-a849-c7636a7e9502 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 909.663872] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "refresh_cache-f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 909.663966] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquired lock "refresh_cache-f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 909.664147] nova-compute[62208]: DEBUG nova.network.neutron [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 909.753853] nova-compute[62208]: DEBUG nova.network.neutron [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 909.864755] nova-compute[62208]: DEBUG nova.compute.manager [req-d987f5b9-885f-4a26-94ac-cce90412d7d9 req-94886521-dcb0-4d65-a862-5b9d100663c6 service nova] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Received event network-vif-plugged-158237e6-1cae-4463-a849-c7636a7e9502 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 909.865034] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-d987f5b9-885f-4a26-94ac-cce90412d7d9 req-94886521-dcb0-4d65-a862-5b9d100663c6 service nova] Acquiring lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 909.865820] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-d987f5b9-885f-4a26-94ac-cce90412d7d9 req-94886521-dcb0-4d65-a862-5b9d100663c6 service nova] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 909.866170] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-d987f5b9-885f-4a26-94ac-cce90412d7d9 req-94886521-dcb0-4d65-a862-5b9d100663c6 service nova] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 909.866479] nova-compute[62208]: DEBUG nova.compute.manager [req-d987f5b9-885f-4a26-94ac-cce90412d7d9 req-94886521-dcb0-4d65-a862-5b9d100663c6 service nova] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] No waiting events found dispatching network-vif-plugged-158237e6-1cae-4463-a849-c7636a7e9502 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 909.866674] nova-compute[62208]: WARNING nova.compute.manager [req-d987f5b9-885f-4a26-94ac-cce90412d7d9 req-94886521-dcb0-4d65-a862-5b9d100663c6 service nova] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Received unexpected event network-vif-plugged-158237e6-1cae-4463-a849-c7636a7e9502 for instance with vm_state building and task_state spawning. 
[ 909.981390] nova-compute[62208]: DEBUG nova.network.neutron [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Updating instance_info_cache with network_info: [{"id": "158237e6-1cae-4463-a849-c7636a7e9502", "address": "fa:16:3e:1c:1b:8e", "network": {"id": "a9103498-45dd-4439-b419-ebb4901404d3", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1468906780-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "ef379fcf2b9d4456bcbb35bc5f51afb3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd0eb882-ab95-4373-aa20-ee565a9919e3", "external-id": "nsx-vlan-transportzone-510", "segmentation_id": 510, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap158237e6-1c", "ovs_interfaceid": "158237e6-1cae-4463-a849-c7636a7e9502", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 909.994914] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Releasing lock "refresh_cache-f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 909.995261] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Instance network_info: |[{"id": "158237e6-1cae-4463-a849-c7636a7e9502", "address": "fa:16:3e:1c:1b:8e", "network": {"id": "a9103498-45dd-4439-b419-ebb4901404d3", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1468906780-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "ef379fcf2b9d4456bcbb35bc5f51afb3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd0eb882-ab95-4373-aa20-ee565a9919e3", "external-id": "nsx-vlan-transportzone-510", "segmentation_id": 510, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap158237e6-1c", "ovs_interfaceid": "158237e6-1cae-4463-a849-c7636a7e9502", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 909.995672] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1c:1b:8e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fd0eb882-ab95-4373-aa20-ee565a9919e3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '158237e6-1cae-4463-a849-c7636a7e9502', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 910.003103] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Creating folder: Project (ef379fcf2b9d4456bcbb35bc5f51afb3). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 910.003949] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f29a3af9-6f33-4fdb-afb1-1947c9628338 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 910.006523] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 910.006523] nova-compute[62208]: warnings.warn( [ 910.016335] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Created folder: Project (ef379fcf2b9d4456bcbb35bc5f51afb3) in parent group-v17427. [ 910.016545] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Creating folder: Instances. Parent ref: group-v17503. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 910.016811] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1ff2d89c-e8eb-48f1-be28-2cbfde1db780 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 910.018488] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 910.018488] nova-compute[62208]: warnings.warn( [ 910.026291] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Created folder: Instances in parent group-v17503. 
[ 910.026713] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 910.026920] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 910.027125] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-53d27ae4-5792-4359-a745-9b04109ee3ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 910.041686] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 910.041686] nova-compute[62208]: warnings.warn( [ 910.048105] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 910.048105] nova-compute[62208]: value = "task-38508" [ 910.048105] nova-compute[62208]: _type = "Task" [ 910.048105] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 910.051348] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 910.051348] nova-compute[62208]: warnings.warn( [ 910.056368] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38508, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 910.552312] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 910.552312] nova-compute[62208]: warnings.warn( [ 910.558445] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38508, 'name': CreateVM_Task, 'duration_secs': 0.323082} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 910.558838] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 910.559563] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 910.559996] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 910.563224] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bd374c3-5855-40af-8439-f49df7217f2d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 910.575943] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 910.575943] nova-compute[62208]: warnings.warn( [ 910.600494] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Reconfiguring VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 910.600974] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-dff19d90-699e-476a-a723-81fe73c715ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 910.614890] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 910.614890] nova-compute[62208]: warnings.warn( [ 910.622662] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Waiting for the task: (returnval){ [ 910.622662] nova-compute[62208]: value = "task-38509" [ 910.622662] nova-compute[62208]: _type = "Task" [ 910.622662] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 910.626325] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 910.626325] nova-compute[62208]: warnings.warn( [ 910.636257] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Task: {'id': task-38509, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 911.126635] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 911.126635] nova-compute[62208]: warnings.warn( [ 911.133007] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Task: {'id': task-38509, 'name': ReconfigVM_Task, 'duration_secs': 0.110437} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 911.133308] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Reconfigured VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 911.133535] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.574s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 911.133857] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 911.134015] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 911.134348] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 
tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 911.134628] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8247206b-87cf-48fc-b7f1-9a056437be0b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.136599] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 911.136599] nova-compute[62208]: warnings.warn( [ 911.141677] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Waiting for the task: (returnval){ [ 911.141677] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]528d4ef8-1105-95a9-dc6a-a874e1e50471" [ 911.141677] nova-compute[62208]: _type = "Task" [ 911.141677] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 911.145965] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 911.145965] nova-compute[62208]: warnings.warn( [ 911.151791] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]528d4ef8-1105-95a9-dc6a-a874e1e50471, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 911.646938] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 911.646938] nova-compute[62208]: warnings.warn( [ 911.653336] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 911.653744] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 911.654090] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 911.892526] nova-compute[62208]: DEBUG nova.compute.manager [req-8328f9a5-0e6e-4922-af73-9c7569a9a9a8 req-ff4f2735-90cf-408b-b446-4eb03d7dbb1d service nova] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Received event network-changed-158237e6-1cae-4463-a849-c7636a7e9502 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 911.892727] nova-compute[62208]: DEBUG nova.compute.manager [req-8328f9a5-0e6e-4922-af73-9c7569a9a9a8 req-ff4f2735-90cf-408b-b446-4eb03d7dbb1d service nova] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Refreshing instance network info cache due to event network-changed-158237e6-1cae-4463-a849-c7636a7e9502. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 911.892947] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-8328f9a5-0e6e-4922-af73-9c7569a9a9a8 req-ff4f2735-90cf-408b-b446-4eb03d7dbb1d service nova] Acquiring lock "refresh_cache-f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 911.893143] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-8328f9a5-0e6e-4922-af73-9c7569a9a9a8 req-ff4f2735-90cf-408b-b446-4eb03d7dbb1d service nova] Acquired lock "refresh_cache-f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 911.893238] nova-compute[62208]: DEBUG nova.network.neutron [req-8328f9a5-0e6e-4922-af73-9c7569a9a9a8 req-ff4f2735-90cf-408b-b446-4eb03d7dbb1d service nova] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Refreshing network info cache for port 158237e6-1cae-4463-a849-c7636a7e9502 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 912.142534] nova-compute[62208]: DEBUG nova.network.neutron [req-8328f9a5-0e6e-4922-af73-9c7569a9a9a8 req-ff4f2735-90cf-408b-b446-4eb03d7dbb1d service nova] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Updated VIF entry in instance network info cache for port 158237e6-1cae-4463-a849-c7636a7e9502. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 912.142898] nova-compute[62208]: DEBUG nova.network.neutron [req-8328f9a5-0e6e-4922-af73-9c7569a9a9a8 req-ff4f2735-90cf-408b-b446-4eb03d7dbb1d service nova] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Updating instance_info_cache with network_info: [{"id": "158237e6-1cae-4463-a849-c7636a7e9502", "address": "fa:16:3e:1c:1b:8e", "network": {"id": "a9103498-45dd-4439-b419-ebb4901404d3", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1468906780-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "ef379fcf2b9d4456bcbb35bc5f51afb3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd0eb882-ab95-4373-aa20-ee565a9919e3", "external-id": "nsx-vlan-transportzone-510", "segmentation_id": 510, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap158237e6-1c", "ovs_interfaceid": "158237e6-1cae-4463-a849-c7636a7e9502", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 912.152530] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-8328f9a5-0e6e-4922-af73-9c7569a9a9a8 req-ff4f2735-90cf-408b-b446-4eb03d7dbb1d service nova] Releasing lock "refresh_cache-f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 915.391412] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 918.822832] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "d1a22d6e-d913-47de-9188-507d2475f745" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 918.823149] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "d1a22d6e-d913-47de-9188-507d2475f745" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 934.140919] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task 
ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 935.140979] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.141204] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.141504] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 937.141504] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 937.161550] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.161803] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.161968] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.162102] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.162229] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.162351] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.162471] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.162590] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.162706] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.162822] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 937.162939] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 937.163467] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.163641] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.163775] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 938.141434] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 939.141762] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 939.142068] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 939.152698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 939.152916] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 939.153080] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 939.153231] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 939.154404] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74939b05-29ac-452a-844b-66da77de64ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.159262] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 939.159262] nova-compute[62208]: warnings.warn( [ 939.165148] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c842e6ad-b788-4e68-a00d-73b5b47b12f3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.168997] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 939.168997] nova-compute[62208]: warnings.warn( [ 939.179495] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0f482c6-b9a5-4c60-b973-e542c68480ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.181706] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 939.181706] nova-compute[62208]: warnings.warn( [ 939.185941] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d0b372a-8bfa-4ade-8bdd-50cc1abd2d5c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.189098] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 939.189098] nova-compute[62208]: warnings.warn( [ 939.215808] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181929MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 939.215965] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 939.216187] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 939.287710] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bd0eef47-56e8-45b6-92b1-e81400994572 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.287898] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9f48db49-1618-4b04-88a6-315c0f9b889a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.288037] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 858b585c-7746-4d38-84c9-b3ee719eb406 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.288164] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.288285] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.288403] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.288520] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.288661] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.288829] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.288958] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 939.300902] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.311677] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b619a949-11d4-4178-9424-54841ee6c26e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.322154] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ecd5e5ca-bcef-45ed-b7b5-fd572d80bd50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.333852] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.344659] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 805622b8-0c67-464e-a666-bd553818e796 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.354821] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 0b350d21-7644-49d1-a3d6-f2c069de2f0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.364957] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c3083789-9915-42dc-9345-22dabdbec850 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.374979] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 235a9ab6-3be0-4205-bdb9-8b85c93f0846 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.384817] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9db515f2-7484-4478-86e0-2e715e08646a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.394938] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 21ce7403-a26b-452f-948a-2e32c606ce00 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.411746] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5672475f-45dd-460d-bb7b-f53dfee798b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.425096] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 61cc043c-1d4a-4a47-86b0-cc4fb61abed4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.435929] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 6e798882-aa11-4c1b-891c-7428d2bba113 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.446774] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.460531] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b03b2152-c17a-4757-b1ad-8e2fd1430fb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.473989] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ab417ba8-5304-4dfd-a08e-102a43996d9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.484449] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance a2d62b77-29a3-4813-b4db-782e1aa52834 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.496105] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3cd22f76-71d2-4d03-88c6-10192bb9418e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.506215] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 939.506470] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 939.506615] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 939.889157] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b1b7470-2b6a-484c-9a42-006469471198 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.892132] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 939.892132] nova-compute[62208]: warnings.warn( [ 939.897300] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0a2e7ca-0226-4296-81be-17f750ab887a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.900281] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 939.900281] nova-compute[62208]: warnings.warn( [ 939.926797] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bef4350-6783-4276-9040-4fb1d5d9a4a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.929253] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 939.929253] nova-compute[62208]: warnings.warn( [ 939.934696] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df27fb1b-59b1-4ecb-81af-e859bbf78817 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.938356] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 939.938356] nova-compute[62208]: warnings.warn( [ 939.947811] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 939.956469] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 939.974091] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 939.974364] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.758s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.968615] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 955.936809] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 955.936809] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 955.936809] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 955.937787] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 955.939233] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 955.939743] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Copying Virtual Disk [datastore2] vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/3d542211-9a3f-47af-b7f0-bdec95792ba6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 955.939743] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cc71d5e3-fd06-4018-bd5b-714bb6b5fd71 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 955.941909] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 955.941909] nova-compute[62208]: warnings.warn( [ 955.948401] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Waiting for the task: (returnval){ [ 955.948401] nova-compute[62208]: value = "task-38510" [ 955.948401] nova-compute[62208]: _type = "Task" [ 955.948401] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 955.952257] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 955.952257] nova-compute[62208]: warnings.warn( [ 955.957560] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Task: {'id': task-38510, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 956.452533] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.452533] nova-compute[62208]: warnings.warn( [ 956.458415] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 956.458809] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 956.459270] nova-compute[62208]: Faults: ['InvalidArgument'] [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Traceback (most recent call last): [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] yield resources [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] self.driver.spawn(context, instance, image_meta, [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] self._vmops.spawn(context, instance, image_meta, injected_files, [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] self._fetch_image_if_missing(context, vi) [ 956.459270] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] image_cache(vi, tmp_image_ds_loc) [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] vm_util.copy_virtual_disk( [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] session._wait_for_task(vmdk_copy_task) [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] return self.wait_for_task(task_ref) [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] return evt.wait() [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] result = hub.switch() [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 956.459584] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] return self.greenlet.switch() [ 956.459972] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 956.459972] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] self.f(*self.args, **self.kw) [ 956.459972] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 956.459972] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] raise exceptions.translate_fault(task_info.error) [ 956.459972] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: 
fileType [ 956.459972] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Faults: ['InvalidArgument'] [ 956.459972] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] [ 956.459972] nova-compute[62208]: INFO nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Terminating instance [ 956.462074] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 956.462267] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 956.462546] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 956.462736] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 956.463461] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85fa60d4-d63f-4f66-acf7-abab465dd04c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.466263] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-63b85fc7-ba1a-487c-8d04-188667efcabe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.467794] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.467794] nova-compute[62208]: warnings.warn( [ 956.468187] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.468187] nova-compute[62208]: warnings.warn( [ 956.472980] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 956.473256] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0f32d92e-b439-4958-b2d3-91dec82eca20 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.475562] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 956.475736] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 956.476951] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.476951] nova-compute[62208]: warnings.warn( [ 956.477375] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d442f9a-7b62-4b04-947b-f1aafc92bc41 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.479589] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.479589] nova-compute[62208]: warnings.warn( [ 956.482520] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Waiting for the task: (returnval){ [ 956.482520] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521d433a-3472-dc55-4839-6f9101b060f9" [ 956.482520] nova-compute[62208]: _type = "Task" [ 956.482520] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 956.485221] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.485221] nova-compute[62208]: warnings.warn( [ 956.499611] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 956.499901] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Creating directory with path [datastore2] vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 956.500168] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-adbbbdf5-d511-4a43-9b36-bf7ac08def30 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.502195] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.502195] nova-compute[62208]: warnings.warn( [ 956.521842] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Created directory with path [datastore2] vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 956.521981] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Fetch image to [datastore2] vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 956.522154] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 956.522968] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b515b509-dbaf-4923-ab62-30866fb1b5ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.525408] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.525408] nova-compute[62208]: warnings.warn( [ 956.530528] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52582b8f-4c04-4870-9dd8-bc0306be8a01 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.532848] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.532848] nova-compute[62208]: warnings.warn( [ 956.540482] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c025d507-46f7-4afb-a6b6-4c100305f78e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.544144] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.544144] nova-compute[62208]: warnings.warn( [ 956.547287] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 956.547505] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 956.547749] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Deleting the datastore file [datastore2] bd0eef47-56e8-45b6-92b1-e81400994572 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 956.548012] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-70b926b4-6873-4919-8688-9941e5801138 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.574483] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.574483] nova-compute[62208]: warnings.warn( [ 956.575675] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c047628-aebd-40bc-8cfd-e3eb4e0f0b5c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.578585] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.578585] nova-compute[62208]: warnings.warn( [ 956.580774] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Waiting for the task: (returnval){ [ 956.580774] nova-compute[62208]: value = "task-38512" [ 956.580774] nova-compute[62208]: _type = "Task" [ 956.580774] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 956.585638] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7a23efab-8c2a-4a63-ae60-34f5caedde7c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.587298] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.587298] nova-compute[62208]: warnings.warn( [ 956.587640] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 956.587640] nova-compute[62208]: warnings.warn( [ 956.592439] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Task: {'id': task-38512, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 956.609412] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 956.660434] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 956.716354] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 956.716528] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 957.086724] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 957.086724] nova-compute[62208]: warnings.warn( [ 957.093953] nova-compute[62208]: DEBUG oslo_vmware.api [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Task: {'id': task-38512, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075269} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 957.094318] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 957.094534] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 957.094713] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 957.094896] nova-compute[62208]: INFO nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Took 0.63 seconds to destroy the instance on the hypervisor. [ 957.097092] nova-compute[62208]: DEBUG nova.compute.claims [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9373409a0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 957.097264] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 957.097481] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 957.504965] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e564ad01-16cf-45cb-91aa-0167e07e43b7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.507666] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 957.507666] nova-compute[62208]: warnings.warn( [ 957.513286] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38937ec1-5c50-4228-a3ae-7aec3ea0f1d1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.516694] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 957.516694] nova-compute[62208]: warnings.warn( [ 957.543402] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87fd5992-0e3c-4814-8795-0f400e4c8346 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.547153] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 957.547153] nova-compute[62208]: warnings.warn( [ 957.554507] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30b81da4-db8d-45b6-ba8a-f62265c7389a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.560532] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 957.560532] nova-compute[62208]: warnings.warn( [ 957.576237] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 957.585660] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 957.603767] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.506s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 957.604358] nova-compute[62208]: Faults: ['InvalidArgument'] [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Traceback (most recent call last): [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] self.driver.spawn(context, instance, image_meta, [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] self._vmops.spawn(context, instance, image_meta, injected_files, [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] self._fetch_image_if_missing(context, vi) [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] image_cache(vi, tmp_image_ds_loc) [ 957.604358] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] vm_util.copy_virtual_disk( [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] session._wait_for_task(vmdk_copy_task) [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] return self.wait_for_task(task_ref) [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] return evt.wait() [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] result = hub.switch() [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] return self.greenlet.switch() [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 957.604681] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] self.f(*self.args, **self.kw) [ 957.605025] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 957.605025] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] raise exceptions.translate_fault(task_info.error) [ 957.605025] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 957.605025] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Faults: ['InvalidArgument'] [ 957.605025] nova-compute[62208]: ERROR nova.compute.manager [instance: bd0eef47-56e8-45b6-92b1-e81400994572] [ 957.605242] nova-compute[62208]: DEBUG nova.compute.utils [None 
req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 957.606805] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Build of instance bd0eef47-56e8-45b6-92b1-e81400994572 was re-scheduled: A specified parameter was not correct: fileType [ 957.606805] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 957.607188] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 957.607367] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 957.607545] nova-compute[62208]: DEBUG nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 957.607769] nova-compute[62208]: DEBUG nova.network.neutron [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 957.884330] nova-compute[62208]: DEBUG nova.network.neutron [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 957.900657] nova-compute[62208]: INFO nova.compute.manager [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Took 0.29 seconds to deallocate network for instance. 
[ 957.996342] nova-compute[62208]: INFO nova.scheduler.client.report [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Deleted allocations for instance bd0eef47-56e8-45b6-92b1-e81400994572 [ 958.017868] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9e2885f0-382c-444d-9c71-ff0848070f95 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "bd0eef47-56e8-45b6-92b1-e81400994572" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 374.921s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.019199] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "bd0eef47-56e8-45b6-92b1-e81400994572" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 176.368s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 958.019436] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Acquiring lock "bd0eef47-56e8-45b6-92b1-e81400994572-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 958.019619] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "bd0eef47-56e8-45b6-92b1-e81400994572-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 958.019779] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "bd0eef47-56e8-45b6-92b1-e81400994572-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.022019] nova-compute[62208]: INFO nova.compute.manager [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Terminating instance [ 958.024259] nova-compute[62208]: DEBUG nova.compute.manager [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 958.024447] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 958.025009] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ada88059-64a9-4de0-8e64-7d7c95310da9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.027085] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.027085] nova-compute[62208]: warnings.warn( [ 958.035564] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ce1ac10-a43f-48b6-816d-c4cc120c25e1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.048539] nova-compute[62208]: DEBUG nova.compute.manager [None req-f219345c-7bf9-4222-833a-687f7f181060 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: e21efb06-821b-4bec-a6d9-f57ae59d038a] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 958.050871] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.050871] nova-compute[62208]: warnings.warn( [ 958.069290] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bd0eef47-56e8-45b6-92b1-e81400994572 could not be found. [ 958.069546] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 958.069731] nova-compute[62208]: INFO nova.compute.manager [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 958.069974] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 958.070192] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 958.070288] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 958.074017] nova-compute[62208]: DEBUG nova.compute.manager [None req-f219345c-7bf9-4222-833a-687f7f181060 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: e21efb06-821b-4bec-a6d9-f57ae59d038a] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 958.097881] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 958.100211] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f219345c-7bf9-4222-833a-687f7f181060 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "e21efb06-821b-4bec-a6d9-f57ae59d038a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.725s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.106748] nova-compute[62208]: INFO nova.compute.manager [-] [instance: bd0eef47-56e8-45b6-92b1-e81400994572] Took 0.04 seconds to deallocate network for instance. [ 958.113425] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 958.174040] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 958.174296] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 958.175758] nova-compute[62208]: INFO nova.compute.claims [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 958.203705] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-208c1493-34fe-4445-981e-41324b7a6b12 tempest-FloatingIPsAssociationTestJSON-830652871 tempest-FloatingIPsAssociationTestJSON-830652871-project-member] Lock "bd0eef47-56e8-45b6-92b1-e81400994572" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.184s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.577490] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e59dd0eb-6fb7-4bad-a5cf-4d815e90645e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.580144] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.580144] nova-compute[62208]: warnings.warn( [ 958.585825] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f0cea51-f4f6-43fb-9f35-959f698cb000 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.588938] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.588938] nova-compute[62208]: warnings.warn( [ 958.616578] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20a3562b-8a23-4088-a8eb-c10f09f6f239 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.619283] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.619283] nova-compute[62208]: warnings.warn( [ 958.625637] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf67f011-5c17-467e-971b-4360a1d80425 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.629478] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.629478] nova-compute[62208]: warnings.warn( [ 958.639769] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 958.648982] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 958.667062] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.493s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.667552] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 958.703348] nova-compute[62208]: DEBUG nova.compute.utils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 958.707854] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 958.707979] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 958.719435] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 958.747414] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] No network configured {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1188}} [ 958.747588] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance network_info: |[]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 958.788296] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 958.809572] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 958.809820] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 958.809971] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 958.810147] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 958.810284] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 958.810426] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 958.810635] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 958.810797] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 958.810964] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 958.811124] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 958.811297] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 958.812161] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e08ccea-9b14-458e-bb7a-0b24c717b051 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.814615] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.814615] nova-compute[62208]: warnings.warn( [ 958.820914] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3151513d-4fbf-4052-84cb-2cf81d2b71d3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.826254] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.826254] nova-compute[62208]: warnings.warn( [ 958.836677] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 958.844854] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 958.844854] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 958.844854] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ca4c42e6-7899-4b0c-b7c4-819b851d3d83 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.853846] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.853846] nova-compute[62208]: warnings.warn( [ 958.860196] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 958.860196] nova-compute[62208]: value = "task-38513" [ 958.860196] nova-compute[62208]: _type = "Task" [ 958.860196] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 958.863280] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 958.863280] nova-compute[62208]: warnings.warn( [ 958.868191] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38513, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 959.363836] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 959.363836] nova-compute[62208]: warnings.warn( [ 959.370022] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38513, 'name': CreateVM_Task, 'duration_secs': 0.241528} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 959.370199] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 959.370524] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 959.370777] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 959.373552] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d189dd3-1b81-4961-85d8-4e6cf85ab65e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.386223] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 959.386223] nova-compute[62208]: warnings.warn( [ 959.409238] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Reconfiguring VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 959.409575] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-db54a77b-8d15-477c-bca9-a047b0bc563e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.420679] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 959.420679] nova-compute[62208]: warnings.warn( [ 959.426298] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 959.426298] nova-compute[62208]: value = "task-38514" [ 959.426298] nova-compute[62208]: _type = "Task" [ 959.426298] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 959.431136] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 959.431136] nova-compute[62208]: warnings.warn( [ 959.437069] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38514, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 959.930525] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 959.930525] nova-compute[62208]: warnings.warn( [ 959.938640] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38514, 'name': ReconfigVM_Task, 'duration_secs': 0.103744} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 959.940196] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Reconfigured VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 959.940480] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.570s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 959.940758] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 959.940910] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 959.941258] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa 
tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 959.941556] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0094ecea-f317-4724-be97-994d15b3f5cf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.943360] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 959.943360] nova-compute[62208]: warnings.warn( [ 959.947401] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 959.947401] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52cfcb7b-caaa-ab97-98be-e56108684af2" [ 959.947401] nova-compute[62208]: _type = "Task" [ 959.947401] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 959.951202] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 959.951202] nova-compute[62208]: warnings.warn( [ 959.963925] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52cfcb7b-caaa-ab97-98be-e56108684af2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 960.451552] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 960.451552] nova-compute[62208]: warnings.warn( [ 960.458377] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 960.458671] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 960.458894] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 961.465645] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 967.212553] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "7f79eba6-e15c-4402-b46b-028d552a81d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 967.212949] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 970.852336] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-20ff9ed8-8b8b-4967-a5d9-1b0e1faf3a30 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "79777c07-535b-43ca-9d49-7d595da14adc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 970.852678] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-20ff9ed8-8b8b-4967-a5d9-1b0e1faf3a30 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock 
"79777c07-535b-43ca-9d49-7d595da14adc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 996.142837] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 996.142837] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 997.136679] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 997.159453] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 997.159715] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 997.159749] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 997.181544] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.181705] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.181834] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.181953] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.182111] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.182176] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.182287] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.182413] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.182522] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.182635] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 997.182751] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 998.140605] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 998.140805] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 999.141447] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.140849] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.141071] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.152970] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.153311] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1000.153311] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1000.153467] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1000.154550] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f85adb4-c225-4b2e-8020-fb599fefc17f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.157849] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1000.157849] nova-compute[62208]: warnings.warn( [ 1000.164084] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-854416ed-654a-4ad8-b5b7-206f8043e897 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.167913] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1000.167913] nova-compute[62208]: warnings.warn( [ 1000.188046] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e37d1b81-bfc9-4b36-abc0-5a1c07a74a25 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.190555] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1000.190555] nova-compute[62208]: warnings.warn( [ 1000.196501] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab053485-fbef-43a8-bf6a-eb6b79effc34 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.201295] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1000.201295] nova-compute[62208]: warnings.warn( [ 1000.231585] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181939MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1000.231749] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.231936] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1000.325713] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9f48db49-1618-4b04-88a6-315c0f9b889a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.325876] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 858b585c-7746-4d38-84c9-b3ee719eb406 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.325999] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.326116] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.326229] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.326343] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.326456] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.326571] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.326683] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.326793] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1000.341088] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.358500] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 805622b8-0c67-464e-a666-bd553818e796 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.367015] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 0b350d21-7644-49d1-a3d6-f2c069de2f0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.388221] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c3083789-9915-42dc-9345-22dabdbec850 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.402655] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 235a9ab6-3be0-4205-bdb9-8b85c93f0846 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.414887] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9db515f2-7484-4478-86e0-2e715e08646a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.430239] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 21ce7403-a26b-452f-948a-2e32c606ce00 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.441786] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5672475f-45dd-460d-bb7b-f53dfee798b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.453962] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 61cc043c-1d4a-4a47-86b0-cc4fb61abed4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.468502] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 6e798882-aa11-4c1b-891c-7428d2bba113 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.485844] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.499319] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b03b2152-c17a-4757-b1ad-8e2fd1430fb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.510561] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ab417ba8-5304-4dfd-a08e-102a43996d9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.522135] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance a2d62b77-29a3-4813-b4db-782e1aa52834 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.533255] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3cd22f76-71d2-4d03-88c6-10192bb9418e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.544506] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.555681] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.572793] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 79777c07-535b-43ca-9d49-7d595da14adc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1000.573185] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1000.573413] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1001.025907] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9074ced5-9774-4773-9e90-c91c0da796f5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.028669] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1001.028669] nova-compute[62208]: warnings.warn( [ 1001.034786] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-021ccb67-abd8-4fb2-afa7-79bc67c2cd0e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.038082] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1001.038082] nova-compute[62208]: warnings.warn( [ 1001.068990] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e0d556d-d5b7-4125-99eb-fcdf1be3ed8c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.071887] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1001.071887] nova-compute[62208]: warnings.warn( [ 1001.081082] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cf9f5e0-bb93-4cbd-8caa-5a6d41c141be {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.083725] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1001.083725] nova-compute[62208]: warnings.warn( [ 1001.092371] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1001.104028] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1001.129024] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1001.129239] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.897s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.404385] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-76e638d2-c0a1-41d2-a003-4ee13e404c94 tempest-ImagesNegativeTestJSON-1148079054 tempest-ImagesNegativeTestJSON-1148079054-project-member] Acquiring lock "6f86dbc6-cfa0-422f-8c80-93c47a7764df" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.404669] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-76e638d2-c0a1-41d2-a003-4ee13e404c94 tempest-ImagesNegativeTestJSON-1148079054 tempest-ImagesNegativeTestJSON-1148079054-project-member] Lock "6f86dbc6-cfa0-422f-8c80-93c47a7764df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1002.124464] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1002.124568] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1004.626447] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1004.626447] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1004.626447] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1004.627261] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 
tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1004.627261] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Copying Virtual Disk [datastore2] vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/0887a6d8-8324-4539-b5c5-4293eb9af436/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1004.627261] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1b956a78-10f7-4800-8c7c-8ed42d942a4f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.627261] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1004.627261] nova-compute[62208]: warnings.warn( [ 1004.633084] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Waiting for the task: (returnval){ [ 1004.633084] nova-compute[62208]: value = "task-38515" [ 1004.633084] nova-compute[62208]: _type = "Task" [ 1004.633084] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1004.637075] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1004.637075] nova-compute[62208]: warnings.warn( [ 1004.645899] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Task: {'id': task-38515, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1005.137516] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1005.137516] nova-compute[62208]: warnings.warn( [ 1005.143727] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1005.144072] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1005.144643] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Traceback (most recent call last): [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] yield resources [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] self.driver.spawn(context, instance, image_meta, [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] self._fetch_image_if_missing(context, vi) [ 1005.144643] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] image_cache(vi, tmp_image_ds_loc) [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] vm_util.copy_virtual_disk( [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] 
session._wait_for_task(vmdk_copy_task) [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] return self.wait_for_task(task_ref) [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] return evt.wait() [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] result = hub.switch() [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1005.145391] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] return self.greenlet.switch() [ 1005.146338] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1005.146338] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] self.f(*self.args, **self.kw) [ 1005.146338] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1005.146338] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] raise exceptions.translate_fault(task_info.error) [ 1005.146338] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1005.146338] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Faults: ['InvalidArgument'] [ 1005.146338] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] [ 1005.146338] nova-compute[62208]: INFO nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Terminating instance [ 1005.146930] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1005.147159] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 
tempest-TenantUsagesTestJSON-1774783593-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1005.147411] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c7beee27-97a5-4c7c-834b-7389ae2a9d2a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.151077] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1005.151077] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquired lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1005.151077] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1005.151843] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1005.151843] nova-compute[62208]: warnings.warn( [ 1005.158554] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1005.158821] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1005.160208] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5d84606f-8a40-41e2-80a0-26f3d4c615c8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.165593] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1005.165593] nova-compute[62208]: warnings.warn( [ 1005.170350] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Waiting for the task: (returnval){ [ 1005.170350] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522d3d95-fa47-7aa9-38d6-f96a135abd6c" [ 1005.170350] nova-compute[62208]: _type = "Task" [ 1005.170350] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1005.174506] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1005.174506] nova-compute[62208]: warnings.warn( [ 1005.179874] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522d3d95-fa47-7aa9-38d6-f96a135abd6c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1005.218114] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1005.309585] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1005.325439] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Releasing lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1005.325892] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1005.326072] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1005.327227] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c57eec3-56f5-40b6-9c35-e456cb701350 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.333712] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1005.333712] nova-compute[62208]: warnings.warn( [ 1005.339581] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1005.339581] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8be3b2e4-d8a2-4e29-9849-3cd8dd8b3064 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.341215] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1005.341215] nova-compute[62208]: warnings.warn( [ 1005.370676] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1005.371131] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1005.371478] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Deleting the datastore file [datastore2] 9f48db49-1618-4b04-88a6-315c0f9b889a {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1005.371858] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-100ddb50-c7ab-4db8-b1a6-734a7faaafc6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.121314] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.121314] nova-compute[62208]: warnings.warn( [ 1006.121739] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.121739] nova-compute[62208]: warnings.warn( [ 1006.128603] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1006.128989] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Creating directory with path [datastore2] vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1006.129261] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Waiting for the task: (returnval){ [ 1006.129261] nova-compute[62208]: value = "task-38517" [ 1006.129261] nova-compute[62208]: _type = "Task" [ 1006.129261] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1006.129437] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d900d0be-dbe2-4a87-b239-c99af344636a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.133903] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.133903] nova-compute[62208]: warnings.warn( [ 1006.134235] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.134235] nova-compute[62208]: warnings.warn( [ 1006.140399] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Task: {'id': task-38517, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1006.153662] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Created directory with path [datastore2] vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1006.153870] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Fetch image to [datastore2] vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1006.154034] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1006.155012] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f95150e-8026-4b37-abdd-4debb225c424 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.157612] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.157612] nova-compute[62208]: warnings.warn( [ 1006.163040] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-468042dd-aeed-4184-87a0-83694a51abc7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.165731] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.165731] nova-compute[62208]: warnings.warn( [ 1006.174592] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31582ab1-b4c0-4c66-a69f-d139861c1051 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.179016] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.179016] nova-compute[62208]: warnings.warn( [ 1006.207906] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-064dc6bc-fcb2-472e-8ff7-1eab04a7e2a4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.210689] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.210689] nova-compute[62208]: warnings.warn( [ 1006.215207] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f2777834-b03d-44a1-a116-13a1464baeec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.217018] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.217018] nova-compute[62208]: warnings.warn( [ 1006.243918] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1006.308692] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1006.357697] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1006.357932] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1006.635691] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1006.635691] nova-compute[62208]: warnings.warn( [ 1006.641774] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Task: {'id': task-38517, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.049646} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1006.642033] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1006.642214] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1006.642380] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1006.642545] nova-compute[62208]: INFO nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Took 1.32 seconds to destroy the instance on the hypervisor. [ 1006.642794] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1006.642986] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 1006.645671] nova-compute[62208]: DEBUG nova.compute.claims [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Aborting claim: <nova.compute.claims.Claim object at 0x7fb937200700> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1006.645846] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1006.646064] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1007.223807] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e003749c-ee6d-4730-bb2b-0515f20176cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.226696] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1007.226696] nova-compute[62208]: warnings.warn( [ 1007.232416] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3cca5ca-a965-4a5f-975f-7284902234ca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.235757] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1007.235757] nova-compute[62208]: warnings.warn( [ 1007.263931] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a59d811-83ec-47b4-839b-109e4e8221bf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.266581] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1007.266581] nova-compute[62208]: warnings.warn( [ 1007.272445] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9780ecbf-6fcc-41df-9789-9d940b87164e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.276257] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1007.276257] nova-compute[62208]: warnings.warn( [ 1007.287999] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1007.296762] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1007.322820] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.677s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1007.323377] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Traceback (most recent call last): [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] self.driver.spawn(context, instance, image_meta, [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", 
line 539, in spawn [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] self._fetch_image_if_missing(context, vi) [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] image_cache(vi, tmp_image_ds_loc) [ 1007.323377] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] vm_util.copy_virtual_disk( [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] session._wait_for_task(vmdk_copy_task) [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] return self.wait_for_task(task_ref) [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] return evt.wait() [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] result = hub.switch() [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] return self.greenlet.switch() [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1007.323755] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] self.f(*self.args, **self.kw) [ 1007.324127] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1007.324127] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] raise exceptions.translate_fault(task_info.error) [ 1007.324127] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1007.324127] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Faults: ['InvalidArgument'] [ 1007.324127] nova-compute[62208]: ERROR nova.compute.manager [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] [ 1007.325042] nova-compute[62208]: DEBUG nova.compute.utils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1007.326737] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Build of instance 9f48db49-1618-4b04-88a6-315c0f9b889a was re-scheduled: A specified parameter was not correct: fileType [ 1007.326737] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1007.327132] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1007.327358] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1007.327505] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquired lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1007.327715] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1007.496397] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1007.614468] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1007.633488] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Releasing lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1007.633721] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1007.633908] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Skipping network deallocation for instance since networking was not requested. {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 1007.787228] nova-compute[62208]: INFO nova.scheduler.client.report [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Deleted allocations for instance 9f48db49-1618-4b04-88a6-315c0f9b889a [ 1007.817492] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d0c95ed-7dc2-406c-a2dd-386f9ef34c62 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "9f48db49-1618-4b04-88a6-315c0f9b889a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 404.875s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1007.820037] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "9f48db49-1618-4b04-88a6-315c0f9b889a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 207.272s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1007.820261] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "9f48db49-1618-4b04-88a6-315c0f9b889a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1007.820462] nova-compute[62208]: DEBUG 
oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "9f48db49-1618-4b04-88a6-315c0f9b889a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1007.820623] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "9f48db49-1618-4b04-88a6-315c0f9b889a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1007.822593] nova-compute[62208]: INFO nova.compute.manager [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Terminating instance [ 1007.824535] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquiring lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1007.824684] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Acquired lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1007.824919] nova-compute[62208]: DEBUG nova.network.neutron [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1007.840494] nova-compute[62208]: DEBUG nova.compute.manager [None req-f98d9b3b-de29-464b-ab9a-9c5055325504 tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] [instance: b619a949-11d4-4178-9424-54841ee6c26e] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1007.874246] nova-compute[62208]: DEBUG nova.compute.manager [None req-f98d9b3b-de29-464b-ab9a-9c5055325504 tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] [instance: b619a949-11d4-4178-9424-54841ee6c26e] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1007.906416] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f98d9b3b-de29-464b-ab9a-9c5055325504 tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] Lock "b619a949-11d4-4178-9424-54841ee6c26e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 239.351s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1007.920284] nova-compute[62208]: DEBUG nova.compute.manager [None req-e50d2ada-6c42-419f-a7a5-5b2dc8e5907c tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] [instance: ecd5e5ca-bcef-45ed-b7b5-fd572d80bd50] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1007.949802] nova-compute[62208]: DEBUG nova.compute.manager [None req-e50d2ada-6c42-419f-a7a5-5b2dc8e5907c tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] [instance: ecd5e5ca-bcef-45ed-b7b5-fd572d80bd50] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1007.970170] nova-compute[62208]: DEBUG nova.network.neutron [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1007.974753] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e50d2ada-6c42-419f-a7a5-5b2dc8e5907c tempest-ServerShowV247Test-1495422308 tempest-ServerShowV247Test-1495422308-project-member] Lock "ecd5e5ca-bcef-45ed-b7b5-fd572d80bd50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 239.050s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1008.002945] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1008.060400] nova-compute[62208]: DEBUG nova.network.neutron [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1008.070566] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1008.071021] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1008.072819] nova-compute[62208]: INFO nova.compute.claims [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1008.084770] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Releasing lock "refresh_cache-9f48db49-1618-4b04-88a6-315c0f9b889a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1008.085192] nova-compute[62208]: DEBUG nova.compute.manager [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1008.085387] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1008.087382] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5fe454e7-f1ad-4855-a2dd-370a971bc81e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.088559] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1008.088559] nova-compute[62208]: warnings.warn( [ 1008.096265] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57fad070-7858-4b30-ad71-9e24ca1daef0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.110811] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1008.110811] nova-compute[62208]: warnings.warn( [ 1008.136934] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9f48db49-1618-4b04-88a6-315c0f9b889a could not be found. [ 1008.137354] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1008.137728] nova-compute[62208]: INFO nova.compute.manager [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1008.138117] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1008.140638] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1008.140909] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1008.184915] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1008.194748] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1008.207668] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 9f48db49-1618-4b04-88a6-315c0f9b889a] Took 0.07 seconds to deallocate network for instance. 
[ 1008.358777] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ef48cd50-4bbe-4f91-9212-a7efd609dcd8 tempest-ServerDiagnosticsV248Test-773836771 tempest-ServerDiagnosticsV248Test-773836771-project-member] Lock "9f48db49-1618-4b04-88a6-315c0f9b889a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.539s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1008.688389] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28515f00-662d-4600-a357-4dbe6fd37c72 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.691030] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1008.691030] nova-compute[62208]: warnings.warn( [ 1008.697120] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9db9555d-f687-43e1-a5d8-2e473103c052 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.700408] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1008.700408] nova-compute[62208]: warnings.warn( [ 1008.734489] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cacd17dc-25e6-4bb3-8e19-0e68f521279e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.735222] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1008.735222] nova-compute[62208]: warnings.warn( [ 1008.743214] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eb21b8b-0644-43a5-bc32-091ea15ffe6c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.749619] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1008.749619] nova-compute[62208]: warnings.warn( [ 1008.763571] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1008.774108] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1008.791020] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.720s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1008.791403] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1008.836984] nova-compute[62208]: DEBUG nova.compute.utils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1008.838433] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1008.838612] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1008.852132] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1008.901029] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] No network configured {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1188}} [ 1008.901225] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance network_info: |[]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1008.952846] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1008.979784] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:10:25Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='683ec6ff-8878-4ad7-8716-c56b68d1a090',id=38,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-580518149',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1008.980047] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1008.980210] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1008.980396] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1008.980545] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1008.980693] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1008.980903] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1008.981058] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1008.981223] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1008.981384] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1008.981557] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1008.982712] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab8df474-31f7-4377-ba44-0992310fb233 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.985875] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1008.985875] nova-compute[62208]: warnings.warn( [ 1008.991549] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55f024e3-f2ef-4ccd-abfc-592bf58e686c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.001356] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.001356] nova-compute[62208]: warnings.warn( [ 1009.018328] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1009.023957] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Creating folder: Project (8745bead67214fc6b4727d0a0b10708c). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1009.024326] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-81c2acf9-6dce-48c7-bb1f-8d7c98e73033 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.026561] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.026561] nova-compute[62208]: warnings.warn( [ 1009.038162] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Created folder: Project (8745bead67214fc6b4727d0a0b10708c) in parent group-v17427. [ 1009.038162] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Creating folder: Instances. Parent ref: group-v17507. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1009.038162] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-57fb7707-e51a-40b4-901a-f87b7bd5d5b1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.039617] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.039617] nova-compute[62208]: warnings.warn( [ 1009.048444] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Created folder: Instances in parent group-v17507. [ 1009.049017] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1009.049825] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1009.050207] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8fc717ff-6ea2-4f98-9b79-e6d4a0ada521 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.066494] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.066494] nova-compute[62208]: warnings.warn( [ 1009.080344] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1009.080344] nova-compute[62208]: value = "task-38520" [ 1009.080344] nova-compute[62208]: _type = "Task" [ 1009.080344] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1009.086607] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.086607] nova-compute[62208]: warnings.warn( [ 1009.093831] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38520, 'name': CreateVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1009.584334] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.584334] nova-compute[62208]: warnings.warn( [ 1009.590798] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38520, 'name': CreateVM_Task, 'duration_secs': 0.286606} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1009.591050] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1009.591403] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1009.591680] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1009.595237] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8be7ebf5-66ad-4e92-9228-79137dcd28ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.606068] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.606068] nova-compute[62208]: warnings.warn( [ 1009.631276] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Reconfiguring VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1009.631276] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-5c7963c5-90f0-4559-b981-24cf85b40527 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.642677] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.642677] nova-compute[62208]: warnings.warn( [ 1009.650398] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Waiting for the task: (returnval){ [ 1009.650398] nova-compute[62208]: value = "task-38521" [ 1009.650398] nova-compute[62208]: _type = "Task" [ 1009.650398] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1009.653764] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1009.653764] nova-compute[62208]: warnings.warn( [ 1009.661155] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Task: {'id': task-38521, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1010.154583] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1010.154583] nova-compute[62208]: warnings.warn( [ 1010.160668] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Task: {'id': task-38521, 'name': ReconfigVM_Task, 'duration_secs': 0.119477} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1010.161067] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Reconfigured VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1010.161154] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.570s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1010.161394] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1010.161534] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1010.161842] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] 
Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1010.162099] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e3e3aff1-717e-4287-b06b-04fc8ced1ede {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.163711] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1010.163711] nova-compute[62208]: warnings.warn( [ 1010.167326] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Waiting for the task: (returnval){ [ 1010.167326] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524611ae-f3ea-9e49-a2ac-2d33a2c96766" [ 1010.167326] nova-compute[62208]: _type = "Task" [ 1010.167326] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1010.171692] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1010.171692] nova-compute[62208]: warnings.warn( [ 1010.176729] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524611ae-f3ea-9e49-a2ac-2d33a2c96766, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1010.528238] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1010.672998] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1010.672998] nova-compute[62208]: warnings.warn( [ 1010.679728] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1010.679990] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1010.680215] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1012.098760] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1012.099028] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1014.871549] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f16f15a8-06f0-4368-8c66-a91235282f03 tempest-ServerActionsTestOtherB-1030249969 tempest-ServerActionsTestOtherB-1030249969-project-member] Acquiring lock "b21f3c77-3118-4229-96b4-7e242f0d5ad5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1014.872658] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f16f15a8-06f0-4368-8c66-a91235282f03 tempest-ServerActionsTestOtherB-1030249969 tempest-ServerActionsTestOtherB-1030249969-project-member] Lock "b21f3c77-3118-4229-96b4-7e242f0d5ad5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.715545] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 
tempest-ListServersNegativeTestJSON-971902274-project-member] Acquiring lock "38c6aba9-f3ad-475b-b0d2-3feb30073826" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.715788] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] Lock "38c6aba9-f3ad-475b-b0d2-3feb30073826" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.757709] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] Acquiring lock "706adf2f-f3a5-4c70-bdaa-a31911d3fda8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.757954] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] Lock "706adf2f-f3a5-4c70-bdaa-a31911d3fda8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.801689] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] Acquiring lock "228a512d-ee17-440e-88e2-023af55853c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.806727] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] Lock "228a512d-ee17-440e-88e2-023af55853c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1016.290767] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9f62ef70-5f8f-429b-848b-c8c763b758c5 tempest-InstanceActionsTestJSON-1658190985 tempest-InstanceActionsTestJSON-1658190985-project-member] Acquiring lock "3c0deb5b-4eeb-45fb-a248-187734fb8820" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1016.291067] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9f62ef70-5f8f-429b-848b-c8c763b758c5 tempest-InstanceActionsTestJSON-1658190985 tempest-InstanceActionsTestJSON-1658190985-project-member] Lock 
"3c0deb5b-4eeb-45fb-a248-187734fb8820" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1017.042878] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10a457f4-7093-4c34-96ef-2551f63550ce tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "5fb62e8e-15de-4e18-a8e9-a5507e29a4bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1017.043380] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10a457f4-7093-4c34-96ef-2551f63550ce tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "5fb62e8e-15de-4e18-a8e9-a5507e29a4bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1020.764063] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b2ecdd67-6d41-43a6-a4a6-9f9e52580430 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "bed3e7fa-24a1-4399-82de-4a128d655376" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1020.764314] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b2ecdd67-6d41-43a6-a4a6-9f9e52580430 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "bed3e7fa-24a1-4399-82de-4a128d655376" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1026.720280] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0c18eb50-6f0e-4caf-babe-32d8a6a2ca81 tempest-ServerGroupTestJSON-2076335407 tempest-ServerGroupTestJSON-2076335407-project-member] Acquiring lock "e01f0362-0594-471f-a5f9-b444fb774606" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1026.720917] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0c18eb50-6f0e-4caf-babe-32d8a6a2ca81 tempest-ServerGroupTestJSON-2076335407 tempest-ServerGroupTestJSON-2076335407-project-member] Lock "e01f0362-0594-471f-a5f9-b444fb774606" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1055.460942] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1055.460942] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles Traceback (most recent call last): [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1055.460942] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1055.461668] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1055.463501] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1055.463868] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Copying Virtual Disk [datastore2] vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/04727f37-a871-4352-8c34-3d0ea1d87517/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1055.464192] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e5d7b778-1f21-4c34-83ce-dcc84ca56cb1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.467094] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1055.467094] nova-compute[62208]: warnings.warn( [ 1055.472515] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Waiting for the task: (returnval){ [ 1055.472515] nova-compute[62208]: value = "task-38522" [ 1055.472515] nova-compute[62208]: _type = "Task" [ 1055.472515] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1055.475974] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1055.475974] nova-compute[62208]: warnings.warn( [ 1055.481704] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Task: {'id': task-38522, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1055.976808] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1055.976808] nova-compute[62208]: warnings.warn( [ 1055.983551] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1055.983866] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1055.984615] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Traceback (most recent call last): [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] yield resources [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] self.driver.spawn(context, instance, image_meta, [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] self._fetch_image_if_missing(context, vi) [ 1055.984615] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] image_cache(vi, tmp_image_ds_loc) [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] vm_util.copy_virtual_disk( [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] 
session._wait_for_task(vmdk_copy_task) [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] return self.wait_for_task(task_ref) [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] return evt.wait() [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] result = hub.switch() [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1055.985189] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] return self.greenlet.switch() [ 1055.985572] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1055.985572] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] self.f(*self.args, **self.kw) [ 1055.985572] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1055.985572] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] raise exceptions.translate_fault(task_info.error) [ 1055.985572] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1055.985572] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Faults: ['InvalidArgument'] [ 1055.985572] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] [ 1055.985572] nova-compute[62208]: INFO nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Terminating instance [ 1055.991429] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1055.991617] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquired lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" 
{{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1055.991789] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1055.996305] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1055.996543] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1055.996805] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b0f8714f-2bf5-4230-87b9-17d447921630 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.010003] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.010003] nova-compute[62208]: warnings.warn( [ 1056.019754] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1056.019941] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1056.020697] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4ad5872-a80d-415f-be01-8d453525b014 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.023065] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.023065] nova-compute[62208]: warnings.warn( [ 1056.027205] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Waiting for the task: (returnval){ [ 1056.027205] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b7964c-4437-0e52-860e-029e687e0989" [ 1056.027205] nova-compute[62208]: _type = "Task" [ 1056.027205] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1056.030332] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.030332] nova-compute[62208]: warnings.warn( [ 1056.035522] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b7964c-4437-0e52-860e-029e687e0989, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1056.041602] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1056.097390] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1056.106677] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Releasing lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1056.107083] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1056.107272] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1056.108401] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31b9cb9a-8cd4-420e-8de9-9ed99ba22ca8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.111290] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.111290] nova-compute[62208]: warnings.warn( [ 1056.116867] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1056.121505] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c862c0a0-1a9f-44ed-9c39-7287fa3daf32 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.122515] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.122515] nova-compute[62208]: warnings.warn( [ 1056.140946] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1056.157908] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1056.158137] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1056.158320] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Deleting the datastore file [datastore2] 858b585c-7746-4d38-84c9-b3ee719eb406 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1056.158578] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-84551c9b-fc8e-4d0e-a32a-bbbcce6ed789 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.160485] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.160485] nova-compute[62208]: warnings.warn( [ 1056.165357] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Waiting for the task: (returnval){ [ 1056.165357] nova-compute[62208]: value = "task-38524" [ 1056.165357] nova-compute[62208]: _type = "Task" [ 1056.165357] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1056.168701] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.168701] nova-compute[62208]: warnings.warn( [ 1056.173475] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Task: {'id': task-38524, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1056.530908] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.530908] nova-compute[62208]: warnings.warn( [ 1056.538174] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1056.538436] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Creating directory with path [datastore2] vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1056.538682] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-851b123f-db20-42d1-9f77-cc492e3f15f1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.540499] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.540499] nova-compute[62208]: warnings.warn( [ 1056.551242] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Created directory with path [datastore2] vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1056.551433] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Fetch image to [datastore2] vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1056.551607] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1056.552600] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6007280e-ad4f-4999-984a-6180923d171c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.554969] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.554969] nova-compute[62208]: warnings.warn( [ 1056.560111] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d020f7f-745d-4cad-a190-65fe2f38e4d7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.562606] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.562606] nova-compute[62208]: warnings.warn( [ 1056.572256] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85582641-1e87-4460-9b43-9fe5aefffe73 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.576204] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.576204] nova-compute[62208]: warnings.warn( [ 1056.606530] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bad98915-e630-4bdd-bd51-1b8ecc4cf33c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.609145] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.609145] nova-compute[62208]: warnings.warn( [ 1056.613995] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-42c2ac78-4aaf-4db8-87b5-99a6907ba8f4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.615654] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.615654] nova-compute[62208]: warnings.warn( [ 1056.635843] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1056.676983] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1056.676983] nova-compute[62208]: warnings.warn( [ 1056.686383] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Task: {'id': task-38524, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034362} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1056.686884] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1056.687079] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1056.687254] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1056.687460] nova-compute[62208]: INFO nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Took 0.58 seconds to destroy the instance on the hypervisor. [ 1056.687737] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1056.687881] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1056.687978] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1056.706798] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1056.708773] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1056.760521] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1056.764741] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1056.764921] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1056.770394] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Took 0.08 seconds to deallocate network for instance. [ 1056.772859] nova-compute[62208]: DEBUG nova.compute.claims [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936c10100> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1056.773138] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1056.773686] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1057.142006] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1057.233381] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c030e006-7dec-4eb5-bd26-81449d40abe4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.235910] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1057.235910] nova-compute[62208]: warnings.warn( [ 1057.241206] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c7763eb-1a6f-4187-bc30-02690f3df58e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.245652] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1057.245652] nova-compute[62208]: warnings.warn( [ 1057.272965] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fa10857-1b64-499a-b94b-aaa00b7338fd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.275552] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1057.275552] nova-compute[62208]: warnings.warn( [ 1057.281687] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e4db3e2-12fb-413a-b14e-0fc5053da27f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.285894] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1057.285894] nova-compute[62208]: warnings.warn( [ 1057.302572] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1057.312217] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1057.330428] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.557s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1057.330986] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Traceback (most recent call last): [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] self.driver.spawn(context, instance, image_meta, [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] self._fetch_image_if_missing(context, vi) [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] image_cache(vi, tmp_image_ds_loc) [ 1057.330986] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] vm_util.copy_virtual_disk( [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] session._wait_for_task(vmdk_copy_task) [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] return self.wait_for_task(task_ref) [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] return evt.wait() [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] result = hub.switch() [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] return self.greenlet.switch() [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1057.331347] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] self.f(*self.args, **self.kw) [ 1057.331838] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1057.331838] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] raise exceptions.translate_fault(task_info.error) [ 1057.331838] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1057.331838] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Faults: ['InvalidArgument'] [ 1057.331838] nova-compute[62208]: ERROR nova.compute.manager [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] [ 1057.331838] nova-compute[62208]: DEBUG nova.compute.utils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca 
tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1057.333547] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Build of instance 858b585c-7746-4d38-84c9-b3ee719eb406 was re-scheduled: A specified parameter was not correct: fileType [ 1057.333547] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1057.333955] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1057.334185] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1057.334744] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquired lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1057.334744] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1057.387064] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1057.434448] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1057.447881] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Releasing lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1057.447881] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1057.448181] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1057.448274] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1057.471992] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1057.483526] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1057.492483] nova-compute[62208]: INFO nova.compute.manager [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Took 0.04 seconds to deallocate network for instance. 
[ 1057.594429] nova-compute[62208]: INFO nova.scheduler.client.report [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Deleted allocations for instance 858b585c-7746-4d38-84c9-b3ee719eb406 [ 1057.617516] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d104b1f-10cf-4107-9c6e-e3985b68caca tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "858b585c-7746-4d38-84c9-b3ee719eb406" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 450.287s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1057.619142] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "858b585c-7746-4d38-84c9-b3ee719eb406" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 252.377s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1057.619452] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "858b585c-7746-4d38-84c9-b3ee719eb406-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1057.619773] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "858b585c-7746-4d38-84c9-b3ee719eb406-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1057.620140] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "858b585c-7746-4d38-84c9-b3ee719eb406-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1057.623261] nova-compute[62208]: INFO nova.compute.manager [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Terminating instance [ 1057.625002] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Acquiring lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1057.625305] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 
tempest-TenantUsagesTestJSON-1774783593-project-member] Acquired lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1057.625483] nova-compute[62208]: DEBUG nova.network.neutron [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1057.634533] nova-compute[62208]: DEBUG nova.compute.manager [None req-28272ba8-771e-4e85-8870-a1ad850a7bee tempest-InstanceActionsNegativeTestJSON-262890172 tempest-InstanceActionsNegativeTestJSON-262890172-project-member] [instance: 805622b8-0c67-464e-a666-bd553818e796] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1057.659330] nova-compute[62208]: DEBUG nova.network.neutron [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1057.664183] nova-compute[62208]: DEBUG nova.compute.manager [None req-28272ba8-771e-4e85-8870-a1ad850a7bee tempest-InstanceActionsNegativeTestJSON-262890172 tempest-InstanceActionsNegativeTestJSON-262890172-project-member] [instance: 805622b8-0c67-464e-a666-bd553818e796] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1057.693308] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-28272ba8-771e-4e85-8870-a1ad850a7bee tempest-InstanceActionsNegativeTestJSON-262890172 tempest-InstanceActionsNegativeTestJSON-262890172-project-member] Lock "805622b8-0c67-464e-a666-bd553818e796" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 235.924s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1057.706605] nova-compute[62208]: DEBUG nova.compute.manager [None req-3fe8377c-a8f3-4bdb-9564-cb84ef68bf51 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 0b350d21-7644-49d1-a3d6-f2c069de2f0c] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1057.737297] nova-compute[62208]: DEBUG nova.compute.manager [None req-3fe8377c-a8f3-4bdb-9564-cb84ef68bf51 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 0b350d21-7644-49d1-a3d6-f2c069de2f0c] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1057.740561] nova-compute[62208]: DEBUG nova.network.neutron [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1057.748771] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Releasing lock "refresh_cache-858b585c-7746-4d38-84c9-b3ee719eb406" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1057.749217] nova-compute[62208]: DEBUG nova.compute.manager [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1057.749420] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1057.750051] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-52da5fb0-7b54-465a-b12a-661cf878a465 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.758744] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1057.758744] nova-compute[62208]: warnings.warn( [ 1057.764542] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3fe8377c-a8f3-4bdb-9564-cb84ef68bf51 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "0b350d21-7644-49d1-a3d6-f2c069de2f0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 235.403s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1057.767436] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e55740ac-1ab0-4c23-bdee-4e6d25c0a85e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.779815] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1057.779815] nova-compute[62208]: warnings.warn( [ 1057.799613] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 858b585c-7746-4d38-84c9-b3ee719eb406 could not be found. [ 1057.799996] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1057.800283] nova-compute[62208]: INFO nova.compute.manager [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1057.800578] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1057.800855] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1057.800990] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1057.806283] nova-compute[62208]: DEBUG nova.compute.manager [None req-3a4e1cbd-0872-42c0-b5e9-db36baacbfc8 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] [instance: c3083789-9915-42dc-9345-22dabdbec850] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1057.821402] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1057.828484] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1057.835857] nova-compute[62208]: DEBUG nova.compute.manager [None req-3a4e1cbd-0872-42c0-b5e9-db36baacbfc8 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] [instance: c3083789-9915-42dc-9345-22dabdbec850] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1057.840583] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 858b585c-7746-4d38-84c9-b3ee719eb406] Took 0.04 seconds to deallocate network for instance. 
[ 1057.860136] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3a4e1cbd-0872-42c0-b5e9-db36baacbfc8 tempest-VolumesAdminNegativeTest-420285072 tempest-VolumesAdminNegativeTest-420285072-project-member] Lock "c3083789-9915-42dc-9345-22dabdbec850" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 222.078s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1057.871649] nova-compute[62208]: DEBUG nova.compute.manager [None req-b1df3a2e-5adc-4a3b-893e-29e93f36e31f tempest-ServerTagsTestJSON-1796243761 tempest-ServerTagsTestJSON-1796243761-project-member] [instance: 235a9ab6-3be0-4205-bdb9-8b85c93f0846] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1057.908424] nova-compute[62208]: DEBUG nova.compute.manager [None req-b1df3a2e-5adc-4a3b-893e-29e93f36e31f tempest-ServerTagsTestJSON-1796243761 tempest-ServerTagsTestJSON-1796243761-project-member] [instance: 235a9ab6-3be0-4205-bdb9-8b85c93f0846] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1057.941205] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b1df3a2e-5adc-4a3b-893e-29e93f36e31f tempest-ServerTagsTestJSON-1796243761 tempest-ServerTagsTestJSON-1796243761-project-member] Lock "235a9ab6-3be0-4205-bdb9-8b85c93f0846" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.501s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1057.954819] nova-compute[62208]: DEBUG nova.compute.manager [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] [instance: 9db515f2-7484-4478-86e0-2e715e08646a] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1057.972360] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c5587406-a8be-41fa-a0e7-c136c3b8b162 tempest-TenantUsagesTestJSON-1774783593 tempest-TenantUsagesTestJSON-1774783593-project-member] Lock "858b585c-7746-4d38-84c9-b3ee719eb406" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.353s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1057.981702] nova-compute[62208]: DEBUG nova.compute.manager [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] [instance: 9db515f2-7484-4478-86e0-2e715e08646a] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1058.003442] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Lock "9db515f2-7484-4478-86e0-2e715e08646a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.969s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1058.017159] nova-compute[62208]: DEBUG nova.compute.manager [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] [instance: 21ce7403-a26b-452f-948a-2e32c606ce00] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1058.056310] nova-compute[62208]: DEBUG nova.compute.manager [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] [instance: 21ce7403-a26b-452f-948a-2e32c606ce00] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1058.082117] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c0855b1f-d83f-4fc0-991c-cd87054ae271 tempest-MultipleCreateTestJSON-873674115 tempest-MultipleCreateTestJSON-873674115-project-member] Lock "21ce7403-a26b-452f-948a-2e32c606ce00" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 208.007s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1058.094231] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e2c38d5-9f6b-4929-b9be-c4a293024989 tempest-ServersTestManualDisk-1323072622 tempest-ServersTestManualDisk-1323072622-project-member] [instance: 5672475f-45dd-460d-bb7b-f53dfee798b1] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1058.120367] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e2c38d5-9f6b-4929-b9be-c4a293024989 tempest-ServersTestManualDisk-1323072622 tempest-ServersTestManualDisk-1323072622-project-member] [instance: 5672475f-45dd-460d-bb7b-f53dfee798b1] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1058.141301] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1058.141469] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1058.141590] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1058.144882] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e2c38d5-9f6b-4929-b9be-c4a293024989 tempest-ServersTestManualDisk-1323072622 tempest-ServersTestManualDisk-1323072622-project-member] Lock "5672475f-45dd-460d-bb7b-f53dfee798b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.297s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1058.157238] nova-compute[62208]: DEBUG nova.compute.manager [None req-89623a71-b86c-40a6-ba07-065a79f32e60 tempest-ServerActionsV293TestJSON-2076507366 tempest-ServerActionsV293TestJSON-2076507366-project-member] [instance: 61cc043c-1d4a-4a47-86b0-cc4fb61abed4] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1058.162660] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.162836] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.162940] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.163069] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.163194] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.163316] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.163434] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.163552] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.163671] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1058.163796] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1058.183240] nova-compute[62208]: DEBUG nova.compute.manager [None req-89623a71-b86c-40a6-ba07-065a79f32e60 tempest-ServerActionsV293TestJSON-2076507366 tempest-ServerActionsV293TestJSON-2076507366-project-member] [instance: 61cc043c-1d4a-4a47-86b0-cc4fb61abed4] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1058.208635] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-89623a71-b86c-40a6-ba07-065a79f32e60 tempest-ServerActionsV293TestJSON-2076507366 tempest-ServerActionsV293TestJSON-2076507366-project-member] Lock "61cc043c-1d4a-4a47-86b0-cc4fb61abed4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 198.993s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1058.219354] nova-compute[62208]: DEBUG nova.compute.manager [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1058.273575] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1058.273821] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1058.275283] nova-compute[62208]: INFO nova.compute.claims [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1058.634622] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquiring lock "6e798882-aa11-4c1b-891c-7428d2bba113" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1058.800248] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd42a054-5b2d-4f95-ae51-b812aba4b8d9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.802774] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1058.802774] nova-compute[62208]: warnings.warn( [ 1058.808808] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-875e904d-5a10-4a89-8252-2c302bb0fce8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.811661] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1058.811661] nova-compute[62208]: warnings.warn( [ 1058.839081] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb2050b6-ae3d-41b8-b42f-331e50bd431f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.841517] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1058.841517] nova-compute[62208]: warnings.warn( [ 1058.846801] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b384efad-6512-485d-a416-f40a2558fb80 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.851006] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1058.851006] nova-compute[62208]: warnings.warn( [ 1058.860859] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1058.869223] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1058.891042] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.617s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1058.891564] nova-compute[62208]: DEBUG nova.compute.manager [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1058.921501] nova-compute[62208]: DEBUG nova.compute.claims [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9365a7490> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1058.921698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1058.921926] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.140578] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1059.140778] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1059.331931] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abb8f6b2-ff43-4461-beef-85fa55654578 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.334442] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1059.334442] nova-compute[62208]: warnings.warn( [ 1059.339776] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4630a0a-8cb5-4e0c-878d-e72e728d99cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.343452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1059.343452] nova-compute[62208]: warnings.warn( [ 1059.370686] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-607741cf-6270-4644-b3b7-bc0a10d0bd05 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.373266] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1059.373266] nova-compute[62208]: warnings.warn( [ 1059.378602] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7db83a19-ccc9-4696-b4bf-ccd7d60e9ffd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.384261] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1059.384261] nova-compute[62208]: warnings.warn( [ 1059.395833] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1059.405208] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1059.424901] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.503s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1059.425710] nova-compute[62208]: DEBUG nova.compute.utils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Conflict updating instance 6e798882-aa11-4c1b-891c-7428d2bba113. Expected: {'task_state': [None]}. 
Actual: {'task_state': 'deleting'} {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1059.427275] nova-compute[62208]: DEBUG nova.compute.manager [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Instance disappeared during build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2504}} [ 1059.427475] nova-compute[62208]: DEBUG nova.compute.manager [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1059.427733] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquiring lock "refresh_cache-6e798882-aa11-4c1b-891c-7428d2bba113" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1059.427890] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquired lock "refresh_cache-6e798882-aa11-4c1b-891c-7428d2bba113" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1059.428062] nova-compute[62208]: DEBUG nova.network.neutron [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1059.466573] nova-compute[62208]: DEBUG nova.network.neutron [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1059.541592] nova-compute[62208]: DEBUG nova.network.neutron [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1059.550956] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Releasing lock "refresh_cache-6e798882-aa11-4c1b-891c-7428d2bba113" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1059.551125] nova-compute[62208]: DEBUG nova.compute.manager [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1059.551308] nova-compute[62208]: DEBUG nova.compute.manager [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1059.551645] nova-compute[62208]: DEBUG nova.network.neutron [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1059.577587] nova-compute[62208]: DEBUG nova.network.neutron [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1059.586568] nova-compute[62208]: DEBUG nova.network.neutron [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1059.597301] nova-compute[62208]: INFO nova.compute.manager [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Took 0.05 seconds to deallocate network for instance. 
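The "Inventory has not changed for provider 8d308854-..." records repeat the provider's full inventory payload. As a rough worked example (a sketch of how placement-style capacity is derived from such a payload, not an excerpt of nova code), the schedulable capacity per resource class is (total - reserved) * allocation_ratio; the values below are copied from the log entry above:

    # Inventory as logged for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0,
                      "min_unit": 1, "max_unit": 16,    "step_size": 1},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0,
                      "min_unit": 1, "max_unit": 65530, "step_size": 1},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0,
                      "min_unit": 1, "max_unit": 197,   "step_size": 1},
    }

    def schedulable_capacity(inv):
        """(total - reserved) * allocation_ratio for each resource class."""
        return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
                for rc, v in inv.items()}

    print(schedulable_capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}

The max_unit values (16 VCPU, 65530 MB, 197 GB) cap what any single claim, like the one aborted above after the task_state conflict, may consume.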
[ 1059.674044] nova-compute[62208]: INFO nova.scheduler.client.report [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Deleted allocations for instance 6e798882-aa11-4c1b-891c-7428d2bba113 [ 1059.674912] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f69e3072-d3cf-4030-8c07-ae7ce2eac715 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "6e798882-aa11-4c1b-891c-7428d2bba113" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 196.708s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1059.676402] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "6e798882-aa11-4c1b-891c-7428d2bba113" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 1.042s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.676809] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquiring lock "6e798882-aa11-4c1b-891c-7428d2bba113-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1059.677191] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "6e798882-aa11-4c1b-891c-7428d2bba113-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.677536] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "6e798882-aa11-4c1b-891c-7428d2bba113-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1059.679645] nova-compute[62208]: INFO nova.compute.manager [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Terminating instance [ 1059.681422] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquiring lock "refresh_cache-6e798882-aa11-4c1b-891c-7428d2bba113" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1059.681714] nova-compute[62208]: 
DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Acquired lock "refresh_cache-6e798882-aa11-4c1b-891c-7428d2bba113" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1059.681982] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1059.686965] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1059.711670] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1059.742290] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1059.742290] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.743182] nova-compute[62208]: INFO nova.compute.claims [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1059.789436] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1059.803103] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Releasing lock "refresh_cache-6e798882-aa11-4c1b-891c-7428d2bba113" {{(pid=62208) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1059.803548] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1059.803741] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1059.804741] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-59941d84-c821-4989-8413-7409b53342ec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.808058] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1059.808058] nova-compute[62208]: warnings.warn( [ 1059.816700] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4620804f-0ce0-47ea-ab25-23dfcf3c3ae4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.830083] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1059.830083] nova-compute[62208]: warnings.warn( [ 1059.850924] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6e798882-aa11-4c1b-891c-7428d2bba113 could not be found. [ 1059.851053] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1059.851193] nova-compute[62208]: INFO nova.compute.manager [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Took 0.05 seconds to destroy the instance on the hypervisor. 
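The Acquiring / acquired / released bookkeeping that brackets the terminate path above is emitted by oslo.concurrency's lockutils helpers. A minimal sketch of the two call patterns behind those messages (hypothetical lock names, not nova code; assumes oslo.concurrency is installed and DEBUG logging is enabled):

    from oslo_concurrency import lockutils

    # Decorator form: produces the 'Lock "..." acquired by "..." :: waited' /
    # '"released" ... :: held' pairs seen for the per-instance build/terminate locks.
    @lockutils.synchronized("demo-instance-lock")
    def terminate_instance():
        print("terminating under the instance lock")

    # Context-manager form: produces the Acquiring / Acquired / Releasing triple
    # seen around the "refresh_cache-<uuid>" lock.
    def refresh_cache():
        with lockutils.lock("refresh_cache-demo"):
            print("rebuilding network info cache")

    if __name__ == "__main__":
        terminate_instance()
        refresh_cache()

Both forms serialize work on the same in-process lock name, which is why the second terminate request above waited 1.042s for the build lock to be released before it could proceed.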
[ 1059.851429] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1059.851633] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1059.851721] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1059.874331] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1059.882184] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1059.890891] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 6e798882-aa11-4c1b-891c-7428d2bba113] Took 0.04 seconds to deallocate network for instance. [ 1059.989595] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbff722c-4664-4e44-b1fa-d4f486d5db35 tempest-ServerMetadataNegativeTestJSON-1634559369 tempest-ServerMetadataNegativeTestJSON-1634559369-project-member] Lock "6e798882-aa11-4c1b-891c-7428d2bba113" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.313s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.140636] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1060.141521] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6553801-9f97-4c5b-b3f6-809fc5dda760 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.144205] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1060.145192] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.145192] nova-compute[62208]: warnings.warn( [ 1060.150881] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd0a1a5b-d67a-4a3c-8fb7-6f5f4f3418bc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.155533] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1060.155740] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.155740] nova-compute[62208]: warnings.warn( [ 1060.185275] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89e75fdd-be95-4d11-9a45-f7d57a384ce2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.188296] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.188296] nova-compute[62208]: warnings.warn( [ 1060.194019] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94bcc805-872e-488b-a1ef-dd17b149cbcd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.197878] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.197878] nova-compute[62208]: warnings.warn( [ 1060.208277] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1060.218258] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1060.236976] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.495s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.237473] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1060.239796] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.084s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1060.239977] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.240176] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1060.241331] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79e8a719-8bdd-4846-83d9-ae87bbbb9de4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.245406] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.245406] nova-compute[62208]: warnings.warn( [ 1060.250920] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d79b293-ca98-4d99-aa70-4a5600d63d55 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.254563] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.254563] nova-compute[62208]: warnings.warn( [ 1060.265376] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82052ec9-43dd-4dea-bd95-b76e41c2f64b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.267869] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.267869] nova-compute[62208]: warnings.warn( [ 1060.273141] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fe14526-46f5-4f98-bf84-8b881c4eb3c2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.278467] nova-compute[62208]: DEBUG nova.compute.utils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1060.280467] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.280467] nova-compute[62208]: warnings.warn( [ 1060.281211] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1060.281560] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1060.311620] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181919MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1060.311786] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1060.311911] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1060.314011] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1060.360827] nova-compute[62208]: DEBUG nova.policy [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cb3f0377ac64412bf238ba3e97ecd9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4fb2ff705fe34117b2dfb9354ae8cfc8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1060.382499] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.382670] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.382827] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.382956] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.383106] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.383222] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.383347] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.383468] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.383612] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.383723] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1060.392140] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1060.396515] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b03b2152-c17a-4757-b1ad-8e2fd1430fb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.409209] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ab417ba8-5304-4dfd-a08e-102a43996d9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.414934] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1060.415210] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1060.415413] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1060.415619] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1060.415818] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1060.416036] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1060.416241] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1060.416409] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1060.416579] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1060.416744] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1060.416918] nova-compute[62208]: DEBUG nova.virt.hardware [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1060.417987] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aea80d0-8c0b-4b94-8259-6890db70d2ad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.422394] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance a2d62b77-29a3-4813-b4db-782e1aa52834 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.423416] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.423416] nova-compute[62208]: warnings.warn( [ 1060.429470] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4c99740-761a-4a05-ba4e-865fc964b99e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.435601] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3cd22f76-71d2-4d03-88c6-10192bb9418e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.436565] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1060.436565] nova-compute[62208]: warnings.warn( [ 1060.447913] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.458163] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.471298] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 79777c07-535b-43ca-9d49-7d595da14adc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.482364] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 6f86dbc6-cfa0-422f-8c80-93c47a7764df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.492689] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.503057] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b21f3c77-3118-4229-96b4-7e242f0d5ad5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.516459] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 38c6aba9-f3ad-475b-b0d2-3feb30073826 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.530578] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 706adf2f-f3a5-4c70-bdaa-a31911d3fda8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.542392] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 228a512d-ee17-440e-88e2-023af55853c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.552597] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3c0deb5b-4eeb-45fb-a248-187734fb8820 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.566147] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5fb62e8e-15de-4e18-a8e9-a5507e29a4bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.580254] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bed3e7fa-24a1-4399-82de-4a128d655376 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.592732] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e01f0362-0594-471f-a5f9-b444fb774606 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1060.593004] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1060.593584] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1060.710476] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Successfully created port: 65676593-bb14-439e-87af-75e2950f606b {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1061.031225] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9c30231-8696-48c8-99aa-9bba188bf187 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.042615] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1061.042615] nova-compute[62208]: warnings.warn( [ 1061.050015] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5efd2c62-0678-430f-8281-a769404bb646 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.052549] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1061.052549] nova-compute[62208]: warnings.warn( [ 1061.091499] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dd0f0fa-d671-48e0-9527-fdd06cc75401 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.094582] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1061.094582] nova-compute[62208]: warnings.warn( [ 1061.100464] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70742769-9c46-493f-bb5a-6de16d1ef4f1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.104280] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1061.104280] nova-compute[62208]: warnings.warn( [ 1061.114704] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1061.131228] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1061.151439] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1061.151744] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.840s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.323440] nova-compute[62208]: DEBUG nova.compute.manager [req-b0e08045-fbe7-4ca8-a561-7d6d7f4555ea req-1397f54c-8ba7-42ec-b6e4-bca2326568af service nova] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Received event network-vif-plugged-65676593-bb14-439e-87af-75e2950f606b {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1061.323789] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b0e08045-fbe7-4ca8-a561-7d6d7f4555ea req-1397f54c-8ba7-42ec-b6e4-bca2326568af service nova] Acquiring lock "d5c7531e-b496-4aed-be05-f1a96391e327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1061.324057] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b0e08045-fbe7-4ca8-a561-7d6d7f4555ea req-1397f54c-8ba7-42ec-b6e4-bca2326568af service nova] Lock "d5c7531e-b496-4aed-be05-f1a96391e327-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1061.324220] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b0e08045-fbe7-4ca8-a561-7d6d7f4555ea req-1397f54c-8ba7-42ec-b6e4-bca2326568af service nova] Lock "d5c7531e-b496-4aed-be05-f1a96391e327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.324390] nova-compute[62208]: DEBUG nova.compute.manager [req-b0e08045-fbe7-4ca8-a561-7d6d7f4555ea req-1397f54c-8ba7-42ec-b6e4-bca2326568af service nova] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] No waiting events found dispatching network-vif-plugged-65676593-bb14-439e-87af-75e2950f606b {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1061.324647] nova-compute[62208]: WARNING nova.compute.manager [req-b0e08045-fbe7-4ca8-a561-7d6d7f4555ea req-1397f54c-8ba7-42ec-b6e4-bca2326568af service nova] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Received unexpected event network-vif-plugged-65676593-bb14-439e-87af-75e2950f606b for instance with vm_state building and task_state spawning. [ 1061.341050] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Successfully updated port: 65676593-bb14-439e-87af-75e2950f606b {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1061.352253] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "refresh_cache-d5c7531e-b496-4aed-be05-f1a96391e327" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1061.352399] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "refresh_cache-d5c7531e-b496-4aed-be05-f1a96391e327" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1061.352557] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1061.406131] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1061.596389] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Updating instance_info_cache with network_info: [{"id": "65676593-bb14-439e-87af-75e2950f606b", "address": "fa:16:3e:40:26:80", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap65676593-bb", "ovs_interfaceid": "65676593-bb14-439e-87af-75e2950f606b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1061.611639] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "refresh_cache-d5c7531e-b496-4aed-be05-f1a96391e327" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1061.611962] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Instance network_info: |[{"id": "65676593-bb14-439e-87af-75e2950f606b", "address": "fa:16:3e:40:26:80", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap65676593-bb", "ovs_interfaceid": "65676593-bb14-439e-87af-75e2950f606b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 
1061.612521] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:40:26:80', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '13af9422-d668-4413-b63a-766558d83a3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '65676593-bb14-439e-87af-75e2950f606b', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1061.620185] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating folder: Project (4fb2ff705fe34117b2dfb9354ae8cfc8). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1061.620765] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f235e522-7b3c-48b5-b840-01c6c9bea391 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.622914] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1061.622914] nova-compute[62208]: warnings.warn( [ 1061.632999] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created folder: Project (4fb2ff705fe34117b2dfb9354ae8cfc8) in parent group-v17427. [ 1061.633214] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating folder: Instances. Parent ref: group-v17510. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1061.633470] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b293076e-2ad9-4f5c-bf00-5bf3a7016d6c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.635161] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1061.635161] nova-compute[62208]: warnings.warn( [ 1061.644626] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created folder: Instances in parent group-v17510. 
[ 1061.644781] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1061.645000] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1061.645188] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-855ae73f-7914-4a17-b8a9-a326941bdd9f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.660625] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1061.660625] nova-compute[62208]: warnings.warn( [ 1061.668054] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1061.668054] nova-compute[62208]: value = "task-38527" [ 1061.668054] nova-compute[62208]: _type = "Task" [ 1061.668054] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1061.671495] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1061.671495] nova-compute[62208]: warnings.warn( [ 1061.678610] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38527, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1062.147554] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1062.147876] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1062.148065] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1062.172065] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1062.172065] nova-compute[62208]: warnings.warn( [ 1062.178036] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38527, 'name': CreateVM_Task, 'duration_secs': 0.482618} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1062.178345] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1062.179012] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1062.179302] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1062.182294] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2724a154-9aa0-43b9-9e43-36082cbb48f7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.196290] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1062.196290] nova-compute[62208]: warnings.warn( [ 1062.219667] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1062.220131] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-97ce4e84-8d77-4757-b1fb-f74f5726e309 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.231090] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1062.231090] nova-compute[62208]: warnings.warn( [ 1062.237488] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1062.237488] nova-compute[62208]: value = "task-38528" [ 1062.237488] nova-compute[62208]: _type = "Task" [ 1062.237488] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.240791] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1062.240791] nova-compute[62208]: warnings.warn( [ 1062.247816] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38528, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1062.741895] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1062.741895] nova-compute[62208]: warnings.warn( [ 1062.748091] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38528, 'name': ReconfigVM_Task, 'duration_secs': 0.11079} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1062.748377] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1062.748634] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.569s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1062.748919] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1062.749202] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1062.749413] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1062.749669] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-939caa0c-0a7f-4573-93fd-9ae7f857bbf9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.751280] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1062.751280] nova-compute[62208]: warnings.warn( [ 1062.754808] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1062.754808] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522a2e48-1b36-17c6-18c1-97881961985d" [ 1062.754808] nova-compute[62208]: _type = "Task" [ 1062.754808] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.758026] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1062.758026] nova-compute[62208]: warnings.warn( [ 1062.763375] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522a2e48-1b36-17c6-18c1-97881961985d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1063.259281] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1063.259281] nova-compute[62208]: warnings.warn( [ 1063.265456] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1063.265743] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1063.266006] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1063.351699] nova-compute[62208]: DEBUG nova.compute.manager [req-e48cf5a3-8059-48a3-a15e-1f20656ea2c1 req-dbe6a7bf-71d5-4fe7-a65a-d51d8e002d01 service nova] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Received event network-changed-65676593-bb14-439e-87af-75e2950f606b {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1063.352140] nova-compute[62208]: DEBUG nova.compute.manager [req-e48cf5a3-8059-48a3-a15e-1f20656ea2c1 req-dbe6a7bf-71d5-4fe7-a65a-d51d8e002d01 service nova] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Refreshing instance network info cache due to event network-changed-65676593-bb14-439e-87af-75e2950f606b. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1063.352380] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e48cf5a3-8059-48a3-a15e-1f20656ea2c1 req-dbe6a7bf-71d5-4fe7-a65a-d51d8e002d01 service nova] Acquiring lock "refresh_cache-d5c7531e-b496-4aed-be05-f1a96391e327" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1063.352539] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e48cf5a3-8059-48a3-a15e-1f20656ea2c1 req-dbe6a7bf-71d5-4fe7-a65a-d51d8e002d01 service nova] Acquired lock "refresh_cache-d5c7531e-b496-4aed-be05-f1a96391e327" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1063.352711] nova-compute[62208]: DEBUG nova.network.neutron [req-e48cf5a3-8059-48a3-a15e-1f20656ea2c1 req-dbe6a7bf-71d5-4fe7-a65a-d51d8e002d01 service nova] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Refreshing network info cache for port 65676593-bb14-439e-87af-75e2950f606b {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1063.659599] nova-compute[62208]: DEBUG nova.network.neutron [req-e48cf5a3-8059-48a3-a15e-1f20656ea2c1 req-dbe6a7bf-71d5-4fe7-a65a-d51d8e002d01 service nova] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Updated VIF entry in instance network info cache for port 65676593-bb14-439e-87af-75e2950f606b. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1063.660099] nova-compute[62208]: DEBUG nova.network.neutron [req-e48cf5a3-8059-48a3-a15e-1f20656ea2c1 req-dbe6a7bf-71d5-4fe7-a65a-d51d8e002d01 service nova] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Updating instance_info_cache with network_info: [{"id": "65676593-bb14-439e-87af-75e2950f606b", "address": "fa:16:3e:40:26:80", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap65676593-bb", "ovs_interfaceid": "65676593-bb14-439e-87af-75e2950f606b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1063.670877] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e48cf5a3-8059-48a3-a15e-1f20656ea2c1 req-dbe6a7bf-71d5-4fe7-a65a-d51d8e002d01 service nova] Releasing lock "refresh_cache-d5c7531e-b496-4aed-be05-f1a96391e327" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1064.863611] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 
tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "d5c7531e-b496-4aed-be05-f1a96391e327" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1080.106229] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1080.106602] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1105.380997] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1105.380997] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1105.381620] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1105.383396] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1105.383641] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Copying Virtual Disk [datastore2] vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/efc24ef7-a759-4f62-9445-e4f09f155e3c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1105.384074] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cdedaade-69e2-4946-aaf5-207ccb0042eb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.387915] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1105.387915] nova-compute[62208]: warnings.warn( [ 1105.393888] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Waiting for the task: (returnval){ [ 1105.393888] nova-compute[62208]: value = "task-38529" [ 1105.393888] nova-compute[62208]: _type = "Task" [ 1105.393888] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1105.396964] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1105.396964] nova-compute[62208]: warnings.warn( [ 1105.402263] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Task: {'id': task-38529, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1105.898856] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1105.898856] nova-compute[62208]: warnings.warn( [ 1105.907529] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1105.908027] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1105.908921] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Traceback (most recent call last): [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] yield resources [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] self.driver.spawn(context, instance, image_meta, [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] self._fetch_image_if_missing(context, vi) [ 1105.908921] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] image_cache(vi, tmp_image_ds_loc) [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] vm_util.copy_virtual_disk( [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] 
session._wait_for_task(vmdk_copy_task) [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] return self.wait_for_task(task_ref) [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] return evt.wait() [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] result = hub.switch() [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1105.909352] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] return self.greenlet.switch() [ 1105.909770] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1105.909770] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] self.f(*self.args, **self.kw) [ 1105.909770] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1105.909770] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] raise exceptions.translate_fault(task_info.error) [ 1105.909770] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1105.909770] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Faults: ['InvalidArgument'] [ 1105.909770] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] [ 1105.909770] nova-compute[62208]: INFO nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Terminating instance [ 1105.911971] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1105.911971] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] 
Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1105.912861] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1105.913166] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1105.913510] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8ae74d77-403a-4d7c-a3c0-15bd30e2731c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.917345] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95b61e02-6994-4934-872a-65964dbdea71 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.921357] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1105.921357] nova-compute[62208]: warnings.warn( [ 1105.921843] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1105.921843] nova-compute[62208]: warnings.warn( [ 1105.927613] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1105.929144] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e0487533-0bc6-4295-bebe-65ef4a616404 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.931353] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1105.931657] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1105.932707] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-05312d71-6558-4602-ae8b-b3b27c1c74b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.935888] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1105.935888] nova-compute[62208]: warnings.warn( [ 1105.936343] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1105.936343] nova-compute[62208]: warnings.warn( [ 1105.939631] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Waiting for the task: (returnval){ [ 1105.939631] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52852349-35ab-d7c2-27e0-592166e563a2" [ 1105.939631] nova-compute[62208]: _type = "Task" [ 1105.939631] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1105.944405] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1105.944405] nova-compute[62208]: warnings.warn( [ 1105.951200] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52852349-35ab-d7c2-27e0-592166e563a2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1106.005129] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1106.005360] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1106.005546] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Deleting the datastore file [datastore2] d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1106.005862] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b8630bb7-66e3-42c0-8bd4-a0555bc2fedb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.007882] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.007882] nova-compute[62208]: warnings.warn( [ 1106.013305] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Waiting for the task: (returnval){ [ 1106.013305] nova-compute[62208]: value = "task-38531" [ 1106.013305] nova-compute[62208]: _type = "Task" [ 1106.013305] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1106.016325] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.016325] nova-compute[62208]: warnings.warn( [ 1106.021830] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Task: {'id': task-38531, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1106.443559] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.443559] nova-compute[62208]: warnings.warn( [ 1106.450245] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1106.450511] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Creating directory with path [datastore2] vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1106.450755] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-35b81867-a9b4-4d12-b2f2-7876978d96d9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.452786] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.452786] nova-compute[62208]: warnings.warn( [ 1106.462353] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Created directory with path [datastore2] vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1106.462565] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Fetch image to [datastore2] vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1106.462735] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1106.463485] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-366ab4aa-646e-4f85-aa14-3cfb16bc4722 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.465810] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.465810] nova-compute[62208]: warnings.warn( [ 1106.470268] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6445cf3-bed6-42fd-8830-f87c08f2aaf4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.472466] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.472466] nova-compute[62208]: warnings.warn( [ 1106.480713] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ee0fd78-1725-43ee-9c48-d58306c4eaff {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.484272] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.484272] nova-compute[62208]: warnings.warn( [ 1106.511032] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8ee0086-ebb4-469a-82bb-974a61f4d7f5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.513392] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.513392] nova-compute[62208]: warnings.warn( [ 1106.518933] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.518933] nova-compute[62208]: warnings.warn( [ 1106.519445] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2c6bb18a-e75d-4b52-b131-88ac77691040 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.523803] nova-compute[62208]: DEBUG oslo_vmware.api [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Task: {'id': task-38531, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076516} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1106.523914] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.523914] nova-compute[62208]: warnings.warn( [ 1106.524469] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1106.524730] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1106.524922] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1106.525103] nova-compute[62208]: INFO nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Took 0.61 seconds to destroy the instance on the hypervisor. 
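The traceback above bottoms out in _poll_task raising exceptions.translate_fault(task_info.error): the CopyVirtualDisk_Task (task-38529) finished in an error state, and that error surfaces as the VimFaultException "A specified parameter was not correct: fileType" with Faults: ['InvalidArgument'] that aborts the spawn. A minimal sketch of that poll-and-translate pattern follows; it is illustrative only (poll_task, TaskFault and get_task_info are made-up names, not the oslo_vmware API):

# Illustrative sketch only -- not oslo_vmware code. Shows how periodically
# polling a vCenter task can turn a task error (Faults: ['InvalidArgument'])
# into a raised exception, matching the wait_for_task -> _poll_task trace above.
import time


class TaskFault(Exception):
    """Stand-in for a VimFaultException carrying the fault name list."""

    def __init__(self, message, fault_list):
        super().__init__(message)
        self.fault_list = fault_list


def poll_task(get_task_info, task_ref, interval=0.5):
    """Poll task_ref until it finishes; raise TaskFault if it errors out.

    get_task_info is a hypothetical callable returning a dict such as
    {'state': 'error', 'message': '...', 'faults': ['InvalidArgument']}.
    """
    while True:
        info = get_task_info(task_ref)
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            # Counterpart of raise exceptions.translate_fault(task_info.error)
            raise TaskFault(info['message'], info['faults'])
        time.sleep(interval)  # each pass corresponds to a "progress is N%" line

Once that exception propagates, the log shows the expected cleanup: the VM is unregistered, its datastore directory is deleted, and the build is later re-scheduled.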
[ 1106.527257] nova-compute[62208]: DEBUG nova.compute.claims [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Aborting claim: <nova.compute.claims.Claim object at 0x7fb937726650> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1106.527463] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1106.527739] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1106.544610] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1106.599731] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1106.656965] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1106.657149] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1106.924078] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fedc9c4-8418-4362-80eb-79a846b3468b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.927146] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.927146] nova-compute[62208]: warnings.warn( [ 1106.932319] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba959517-667a-4064-b8c7-c172f6c15e3c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.935580] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.935580] nova-compute[62208]: warnings.warn( [ 1106.963014] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c943383a-fa0c-474a-b18b-17f3b8ec8a69 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.965461] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.965461] nova-compute[62208]: warnings.warn( [ 1106.971139] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee387c5e-8933-4b69-89e7-dff9672fb319 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.974914] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1106.974914] nova-compute[62208]: warnings.warn( [ 1106.984818] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1106.993512] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1107.010076] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.482s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1107.010624] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Traceback (most recent call last): [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] self.driver.spawn(context, instance, image_meta, [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] self._fetch_image_if_missing(context, vi) [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] image_cache(vi, tmp_image_ds_loc) [ 1107.010624] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] vm_util.copy_virtual_disk( [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] session._wait_for_task(vmdk_copy_task) [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] return self.wait_for_task(task_ref) [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] return evt.wait() [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] result = hub.switch() [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] return self.greenlet.switch() [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1107.011031] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] self.f(*self.args, **self.kw) [ 1107.011414] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1107.011414] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] raise exceptions.translate_fault(task_info.error) [ 1107.011414] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1107.011414] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Faults: ['InvalidArgument'] [ 1107.011414] nova-compute[62208]: ERROR nova.compute.manager [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] [ 1107.011414] nova-compute[62208]: DEBUG nova.compute.utils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc 
tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1107.012762] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Build of instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 was re-scheduled: A specified parameter was not correct: fileType [ 1107.012762] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1107.013185] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1107.013362] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1107.013533] nova-compute[62208]: DEBUG nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1107.013698] nova-compute[62208]: DEBUG nova.network.neutron [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1107.360454] nova-compute[62208]: DEBUG nova.network.neutron [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1107.373978] nova-compute[62208]: INFO nova.compute.manager [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Took 0.36 seconds to deallocate network for instance. 
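Every vCenter SOAP call in this trace is bracketed by urllib3's InsecureRequestWarning because the session talks to vc1.osci.c.eu-de-1.cloud.sap over HTTPS without certificate verification. The short sketch below shows where that warning comes from and the usual ways of dealing with it; the CA-bundle path is a placeholder, and verifying against a CA file (rather than running unverified) is the real fix in a deployment:

# Shows why the repeated InsecureRequestWarning appears: urllib3 emits it for
# every HTTPS request made with certificate verification disabled.
import requests
import urllib3

url = 'https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk'  # endpoint from this log

try:
    # verify=False is exactly what the warning is complaining about.
    requests.get(url, verify=False, timeout=10)
except requests.exceptions.RequestException:
    pass  # the vCenter host is only reachable from the CI environment

# Proper fix: verify against the vCenter CA bundle (path is a placeholder):
#   requests.get(url, verify='/etc/ssl/certs/vcenter-ca.pem', timeout=10)

# Last resort, hides the symptom rather than fixing it:
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)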
[ 1107.474722] nova-compute[62208]: INFO nova.scheduler.client.report [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Deleted allocations for instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 [ 1107.497066] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-43a89827-b28c-4ce3-bd61-8c099b1bdbfc tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 499.389s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.498233] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 301.993s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.500612] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Acquiring lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1107.500612] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.500612] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.501605] nova-compute[62208]: INFO nova.compute.manager [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Terminating instance [ 1107.503615] nova-compute[62208]: DEBUG nova.compute.manager [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1107.503899] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1107.504407] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-aa5802f4-ee7b-475c-88b4-b5ff8ee90478 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.506639] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1107.506639] nova-compute[62208]: warnings.warn( [ 1107.515335] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09f97c4c-b38f-4845-9d6e-1163e0dc866e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.526428] nova-compute[62208]: DEBUG nova.compute.manager [None req-88ff02c9-19db-4935-b3d7-6ef6f37cad7e tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] [instance: b03b2152-c17a-4757-b1ad-8e2fd1430fb3] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1107.529841] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1107.529841] nova-compute[62208]: warnings.warn( [ 1107.548859] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d2839fc7-a754-4d4f-a7f1-e4b17f2126a5 could not be found. [ 1107.548964] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1107.549165] nova-compute[62208]: INFO nova.compute.manager [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Took 0.05 seconds to destroy the instance on the hypervisor. 
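The Acquiring/acquired/released lines with their waited and held durations are per-name locks that serialize build and terminate operations on the same instance UUID (the terminate above waited 301.993s for the build lock to be released). A stdlib-only sketch of that accounting pattern, purely illustrative and not the oslo_concurrency implementation:

# Illustrative, stdlib-only version of the named-lock logging seen above
# ('Acquiring lock ...' / 'acquired ... waited Ns' / '"released" ... held Ns').
import threading
import time
from collections import defaultdict
from contextlib import contextmanager

_locks = defaultdict(threading.Lock)  # one lock per name, created on demand


@contextmanager
def named_lock(name, caller):
    print(f'Acquiring lock "{name}" by "{caller}"')
    start = time.monotonic()
    with _locks[name]:
        print(f'Lock "{name}" acquired by "{caller}" '
              f':: waited {time.monotonic() - start:.3f}s')
        held_start = time.monotonic()
        try:
            yield
        finally:
            print(f'Lock "{name}" "released" by "{caller}" '
                  f':: held {time.monotonic() - held_start:.3f}s')


# Usage mirroring the instance-UUID lock in the entries above:
with named_lock('d2839fc7-a754-4d4f-a7f1-e4b17f2126a5', 'do_terminate_instance'):
    pass  # terminate work would run here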
[ 1107.549506] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1107.549800] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1107.549917] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1107.553059] nova-compute[62208]: DEBUG nova.compute.manager [None req-88ff02c9-19db-4935-b3d7-6ef6f37cad7e tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] [instance: b03b2152-c17a-4757-b1ad-8e2fd1430fb3] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1107.579356] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1107.587928] nova-compute[62208]: INFO nova.compute.manager [-] [instance: d2839fc7-a754-4d4f-a7f1-e4b17f2126a5] Took 0.04 seconds to deallocate network for instance. [ 1107.594009] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-88ff02c9-19db-4935-b3d7-6ef6f37cad7e tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Lock "b03b2152-c17a-4757-b1ad-8e2fd1430fb3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 229.792s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.606810] nova-compute[62208]: DEBUG nova.compute.manager [None req-afe46e49-843d-4946-a616-7292b5b12bdf tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] [instance: ab417ba8-5304-4dfd-a08e-102a43996d9f] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1107.639128] nova-compute[62208]: DEBUG nova.compute.manager [None req-afe46e49-843d-4946-a616-7292b5b12bdf tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] [instance: ab417ba8-5304-4dfd-a08e-102a43996d9f] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1107.664577] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-afe46e49-843d-4946-a616-7292b5b12bdf tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Lock "ab417ba8-5304-4dfd-a08e-102a43996d9f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 229.204s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.677864] nova-compute[62208]: DEBUG nova.compute.manager [None req-523bc6ca-b005-4f1d-b6f4-68ce99a4391a tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] [instance: a2d62b77-29a3-4813-b4db-782e1aa52834] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1107.703458] nova-compute[62208]: DEBUG nova.compute.manager [None req-523bc6ca-b005-4f1d-b6f4-68ce99a4391a tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] [instance: a2d62b77-29a3-4813-b4db-782e1aa52834] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1107.711227] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1d3f4201-5555-449e-90ea-47d7e7874cdd tempest-TaggedAttachmentsTest-1196056706 tempest-TaggedAttachmentsTest-1196056706-project-member] Lock "d2839fc7-a754-4d4f-a7f1-e4b17f2126a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.213s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.727145] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-523bc6ca-b005-4f1d-b6f4-68ce99a4391a tempest-ListServerFiltersTestJSON-156413940 tempest-ListServerFiltersTestJSON-156413940-project-member] Lock "a2d62b77-29a3-4813-b4db-782e1aa52834" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 228.621s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.740936] nova-compute[62208]: DEBUG nova.compute.manager [None req-587d14de-0f25-47c3-ab7d-1b870a46dad9 tempest-TaggedBootDevicesTest-1956918411 tempest-TaggedBootDevicesTest-1956918411-project-member] [instance: 3cd22f76-71d2-4d03-88c6-10192bb9418e] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1107.768039] nova-compute[62208]: DEBUG nova.compute.manager [None req-587d14de-0f25-47c3-ab7d-1b870a46dad9 tempest-TaggedBootDevicesTest-1956918411 tempest-TaggedBootDevicesTest-1956918411-project-member] [instance: 3cd22f76-71d2-4d03-88c6-10192bb9418e] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1107.792585] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-587d14de-0f25-47c3-ab7d-1b870a46dad9 tempest-TaggedBootDevicesTest-1956918411 tempest-TaggedBootDevicesTest-1956918411-project-member] Lock "3cd22f76-71d2-4d03-88c6-10192bb9418e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 226.869s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1107.804299] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1107.856937] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1107.857222] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.858727] nova-compute[62208]: INFO nova.compute.claims [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1108.184746] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb62c865-3916-4e3f-aef4-4d2851c3bcda {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.187254] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1108.187254] nova-compute[62208]: warnings.warn( [ 1108.192407] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fac155e-fb79-4ef8-9d44-ed09894d1247 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.195417] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1108.195417] nova-compute[62208]: warnings.warn( [ 1108.224040] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c26e3b02-dfe9-4d6d-9fa7-3879ae9d21ad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.226705] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1108.226705] nova-compute[62208]: warnings.warn( [ 1108.232852] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7373db03-4a21-4a08-adb7-99b206ac7c8f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.237251] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1108.237251] nova-compute[62208]: warnings.warn( [ 1108.248932] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1108.259719] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1108.276310] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.419s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.276999] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1108.314642] nova-compute[62208]: DEBUG nova.compute.utils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1108.316418] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1108.316615] nova-compute[62208]: DEBUG nova.network.neutron [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1108.326924] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1108.382057] nova-compute[62208]: DEBUG nova.policy [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48436045b4bc4c67aea1ad9f6028f063', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cf73b75fb335481fa5601be64979220a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1108.400192] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1108.424054] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1108.424332] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1108.424500] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1108.424685] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1108.424834] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1108.424985] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1108.425192] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1108.425353] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1108.425519] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1108.425683] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1108.425854] nova-compute[62208]: DEBUG nova.virt.hardware [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1108.426882] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a92b3f12-38fc-41f5-b4b2-e5e6ceff22a5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.429465] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1108.429465] nova-compute[62208]: warnings.warn( [ 1108.435523] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c059def-9cfb-439e-b852-ca75a377f37b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.439240] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1108.439240] nova-compute[62208]: warnings.warn( [ 1108.713583] nova-compute[62208]: DEBUG nova.network.neutron [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Successfully created port: e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1109.256802] nova-compute[62208]: DEBUG nova.network.neutron [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Successfully updated port: e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1109.272214] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "refresh_cache-d1a22d6e-d913-47de-9188-507d2475f745" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1109.272366] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquired lock "refresh_cache-d1a22d6e-d913-47de-9188-507d2475f745" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1109.272707] nova-compute[62208]: DEBUG nova.network.neutron [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1109.336875] nova-compute[62208]: DEBUG nova.network.neutron [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1109.459698] nova-compute[62208]: DEBUG nova.compute.manager [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Received event network-vif-plugged-e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1109.460092] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] Acquiring lock "d1a22d6e-d913-47de-9188-507d2475f745-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1109.460365] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] Lock "d1a22d6e-d913-47de-9188-507d2475f745-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1109.460607] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] Lock "d1a22d6e-d913-47de-9188-507d2475f745-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1109.460818] nova-compute[62208]: DEBUG nova.compute.manager [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] [instance: d1a22d6e-d913-47de-9188-507d2475f745] No waiting events found dispatching network-vif-plugged-e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1109.461044] nova-compute[62208]: WARNING nova.compute.manager [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Received unexpected event network-vif-plugged-e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a for instance with vm_state building and task_state spawning. [ 1109.461247] nova-compute[62208]: DEBUG nova.compute.manager [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Received event network-changed-e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1109.461442] nova-compute[62208]: DEBUG nova.compute.manager [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Refreshing instance network info cache due to event network-changed-e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1109.461645] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] Acquiring lock "refresh_cache-d1a22d6e-d913-47de-9188-507d2475f745" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1109.548151] nova-compute[62208]: DEBUG nova.network.neutron [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Updating instance_info_cache with network_info: [{"id": "e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a", "address": "fa:16:3e:69:75:29", "network": {"id": "890018c2-9050-43a2-ba34-96dd8d93ed49", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-166626053-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "cf73b75fb335481fa5601be64979220a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60bdba1a-14cf-46b2-9d8b-aeaf4d80c815", "external-id": "nsx-vlan-transportzone-922", "segmentation_id": 922, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape491bfb9-f6", "ovs_interfaceid": "e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1109.569396] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Releasing lock "refresh_cache-d1a22d6e-d913-47de-9188-507d2475f745" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1109.569720] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Instance network_info: |[{"id": "e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a", "address": "fa:16:3e:69:75:29", "network": {"id": "890018c2-9050-43a2-ba34-96dd8d93ed49", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-166626053-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "cf73b75fb335481fa5601be64979220a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60bdba1a-14cf-46b2-9d8b-aeaf4d80c815", "external-id": "nsx-vlan-transportzone-922", "segmentation_id": 922, "bound_drivers": {"0": "nsxv3"}}, "devname": 
"tape491bfb9-f6", "ovs_interfaceid": "e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1109.570042] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] Acquired lock "refresh_cache-d1a22d6e-d913-47de-9188-507d2475f745" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1109.570221] nova-compute[62208]: DEBUG nova.network.neutron [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Refreshing network info cache for port e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1109.571890] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:69:75:29', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '60bdba1a-14cf-46b2-9d8b-aeaf4d80c815', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1109.580295] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Creating folder: Project (cf73b75fb335481fa5601be64979220a). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1109.584167] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c0fab623-7338-4e0b-bfc0-23c61fbcaf02 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.587012] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1109.587012] nova-compute[62208]: warnings.warn( [ 1109.597105] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Created folder: Project (cf73b75fb335481fa5601be64979220a) in parent group-v17427. [ 1109.597332] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Creating folder: Instances. Parent ref: group-v17513. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1109.597547] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3e65e67e-cc8d-4360-92ad-127f1c23a6df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.599499] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1109.599499] nova-compute[62208]: warnings.warn( [ 1109.607511] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Created folder: Instances in parent group-v17513. [ 1109.607797] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1109.607995] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1109.608274] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0694ac37-a2c3-43c8-ae64-6a5bd8fe883d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.627041] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1109.627041] nova-compute[62208]: warnings.warn( [ 1109.632541] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1109.632541] nova-compute[62208]: value = "task-38534" [ 1109.632541] nova-compute[62208]: _type = "Task" [ 1109.632541] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1109.635611] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1109.635611] nova-compute[62208]: warnings.warn( [ 1109.642037] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38534, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1109.891015] nova-compute[62208]: DEBUG nova.network.neutron [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Updated VIF entry in instance network info cache for port e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1109.891392] nova-compute[62208]: DEBUG nova.network.neutron [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Updating instance_info_cache with network_info: [{"id": "e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a", "address": "fa:16:3e:69:75:29", "network": {"id": "890018c2-9050-43a2-ba34-96dd8d93ed49", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-166626053-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "cf73b75fb335481fa5601be64979220a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60bdba1a-14cf-46b2-9d8b-aeaf4d80c815", "external-id": "nsx-vlan-transportzone-922", "segmentation_id": 922, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape491bfb9-f6", "ovs_interfaceid": "e491bfb9-f6d6-4a7b-ad8d-638ac1602a9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1109.903877] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-46571761-59c2-479c-8820-ac637045f7c4 req-a3733c76-b4a0-4344-a8ed-0b47dc54e679 service nova] Releasing lock "refresh_cache-d1a22d6e-d913-47de-9188-507d2475f745" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1110.137009] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1110.137009] nova-compute[62208]: warnings.warn( [ 1110.143158] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38534, 'name': CreateVM_Task, 'duration_secs': 0.291797} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1110.143338] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1110.143957] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1110.144226] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1110.147033] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff8e2581-c389-460a-9011-511ce98b81ab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1110.158463] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1110.158463] nova-compute[62208]: warnings.warn( [ 1110.187175] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Reconfiguring VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1110.187587] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-4867a83a-c4df-41fd-8574-96cdd3545bb3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1110.198101] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1110.198101] nova-compute[62208]: warnings.warn( [ 1110.204143] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Waiting for the task: (returnval){ [ 1110.204143] nova-compute[62208]: value = "task-38535" [ 1110.204143] nova-compute[62208]: _type = "Task" [ 1110.204143] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1110.207238] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1110.207238] nova-compute[62208]: warnings.warn( [ 1110.212610] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Task: {'id': task-38535, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1110.708910] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1110.708910] nova-compute[62208]: warnings.warn( [ 1110.719827] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Task: {'id': task-38535, 'name': ReconfigVM_Task, 'duration_secs': 0.101324} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1110.719827] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Reconfigured VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1110.719827] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.573s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1110.719827] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1110.720121] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1110.720121] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 
tempest-ImagesOneServerTestJSON-549668524-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1110.720121] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-40b3941b-fb4e-4d02-b34e-36ec5e8ac831 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1110.720121] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1110.720121] nova-compute[62208]: warnings.warn( [ 1110.722743] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Waiting for the task: (returnval){ [ 1110.722743] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525cc3ea-6a88-3115-ce15-d01b33232f19" [ 1110.722743] nova-compute[62208]: _type = "Task" [ 1110.722743] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1110.725863] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1110.725863] nova-compute[62208]: warnings.warn( [ 1110.730904] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525cc3ea-6a88-3115-ce15-d01b33232f19, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1111.227644] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1111.227644] nova-compute[62208]: warnings.warn( [ 1111.233834] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1111.234083] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1111.234292] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1115.466163] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "d1a22d6e-d913-47de-9188-507d2475f745" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1116.140723] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1116.141071] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1116.141227] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11219}} [ 1116.152981] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] There are 0 instances to clean {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11228}} [ 1118.153198] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1118.153499] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1118.153499] nova-compute[62208]: DEBUG 
nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1118.174414] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.174578] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.174710] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.174839] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.174963] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.175084] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.175206] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.175323] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.175437] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.175549] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1118.175694] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1119.140963] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1120.136357] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1120.159505] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1120.169368] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1120.169599] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1120.169789] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1120.169915] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1120.171016] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc27b28b-b457-4f82-a1b2-868d1bf0d580 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.174026] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1120.174026] nova-compute[62208]: warnings.warn( [ 1120.180071] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c484875-3563-4634-a65d-39e3ce049a16 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.183636] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1120.183636] nova-compute[62208]: warnings.warn( [ 1120.195722] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c58aa4-6854-4bb4-87a2-7500739f6dd5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.198071] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1120.198071] nova-compute[62208]: warnings.warn( [ 1120.202600] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d39aadc2-fcbe-4efc-bc0e-800049231fc5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.205415] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1120.205415] nova-compute[62208]: warnings.warn( [ 1120.231586] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181874MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1120.231755] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1120.231962] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1120.370099] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 08336643-4254-4447-b7c2-b81054bf9707 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.370266] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.370401] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.370518] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.370639] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.370760] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.370883] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.370997] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.371109] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.371222] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1120.383008] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.394130] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 79777c07-535b-43ca-9d49-7d595da14adc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.404668] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 6f86dbc6-cfa0-422f-8c80-93c47a7764df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.415072] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.425247] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b21f3c77-3118-4229-96b4-7e242f0d5ad5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.435590] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 38c6aba9-f3ad-475b-b0d2-3feb30073826 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.445832] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 706adf2f-f3a5-4c70-bdaa-a31911d3fda8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.455857] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 228a512d-ee17-440e-88e2-023af55853c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.465865] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3c0deb5b-4eeb-45fb-a248-187734fb8820 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.476537] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5fb62e8e-15de-4e18-a8e9-a5507e29a4bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.487339] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bed3e7fa-24a1-4399-82de-4a128d655376 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.498590] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e01f0362-0594-471f-a5f9-b444fb774606 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.508660] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1120.508937] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1120.509122] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1120.525967] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing inventories for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1120.625723] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating ProviderTree inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1120.625927] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1120.637901] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing aggregate associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, aggregates: None {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1120.657196] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing trait associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1120.963062] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edb6fbdd-dd97-4e25-9822-944e27658d4d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.965703] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1120.965703] nova-compute[62208]: warnings.warn( [ 1120.971442] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed3c6384-9ee3-4bf0-bdaf-cfad66601c6f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.975086] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1120.975086] nova-compute[62208]: warnings.warn( [ 1121.003066] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f25d22ce-db3d-4bbb-a0b9-99ae9b7fde92 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.005812] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1121.005812] nova-compute[62208]: warnings.warn( [ 1121.011461] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f0fc37-40a8-4b80-b6cd-b22f2b3a62d5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.015334] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1121.015334] nova-compute[62208]: warnings.warn( [ 1121.025013] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1121.033376] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1121.053854] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1121.054160] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.822s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1122.036328] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.036588] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.036643] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.036763] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1122.140999] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1123.136290] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1123.140913] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1123.141058] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances with incomplete migration {{(pid=62208) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11257}} [ 1123.149505] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1155.496440] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1155.496440] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1155.497090] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to 
vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1155.498842] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1155.499132] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Copying Virtual Disk [datastore2] vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/14d72830-b20b-4e0a-af5f-513ffda1e905/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1155.499450] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-419f832e-57c1-49fe-bbd4-787c35eef0a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.503352] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1155.503352] nova-compute[62208]: warnings.warn( [ 1155.509081] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Waiting for the task: (returnval){ [ 1155.509081] nova-compute[62208]: value = "task-38536" [ 1155.509081] nova-compute[62208]: _type = "Task" [ 1155.509081] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1155.512247] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1155.512247] nova-compute[62208]: warnings.warn( [ 1155.517436] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Task: {'id': task-38536, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1156.014223] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.014223] nova-compute[62208]: warnings.warn( [ 1156.020836] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1156.021168] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1156.021725] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] Traceback (most recent call last): [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] yield resources [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] self.driver.spawn(context, instance, image_meta, [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] self._fetch_image_if_missing(context, vi) [ 1156.021725] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] image_cache(vi, tmp_image_ds_loc) [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] 
vm_util.copy_virtual_disk( [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] session._wait_for_task(vmdk_copy_task) [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] return self.wait_for_task(task_ref) [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] return evt.wait() [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] result = hub.switch() [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1156.022132] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] return self.greenlet.switch() [ 1156.022531] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1156.022531] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] self.f(*self.args, **self.kw) [ 1156.022531] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1156.022531] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] raise exceptions.translate_fault(task_info.error) [ 1156.022531] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1156.022531] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] Faults: ['InvalidArgument'] [ 1156.022531] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] [ 1156.022531] nova-compute[62208]: INFO nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Terminating instance [ 1156.023650] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1156.023858] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1156.024121] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f14fe052-dfae-40aa-87e4-bbe67b9c4994 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.026737] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1156.026932] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1156.027684] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5536f4c2-2f6e-4128-8edd-3d1566255c96 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.030682] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.030682] nova-compute[62208]: warnings.warn( [ 1156.031343] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.031343] nova-compute[62208]: warnings.warn( [ 1156.035789] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1156.036063] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-56735916-7d42-4117-9817-d428ae6cd2fc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.038475] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1156.038696] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1156.039338] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.039338] nova-compute[62208]: warnings.warn( [ 1156.039799] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-46e0568d-654e-4b08-b3c5-f876ec83844e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.042496] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.042496] nova-compute[62208]: warnings.warn( [ 1156.050158] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Waiting for the task: (returnval){ [ 1156.050158] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a1b77a-b5bd-cefe-5e05-657401ca3d9f" [ 1156.050158] nova-compute[62208]: _type = "Task" [ 1156.050158] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1156.053712] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.053712] nova-compute[62208]: warnings.warn( [ 1156.065098] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1156.065371] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Creating directory with path [datastore2] vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1156.065616] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-68136fcb-2ba5-497d-ade5-7f559538ef3c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.067423] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.067423] nova-compute[62208]: warnings.warn( [ 1156.085731] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Created directory with path [datastore2] vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1156.085966] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Fetch image to [datastore2] vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1156.086191] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1156.086940] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4d378e4-7e26-462d-9c74-e3a3ffa10e5d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.089978] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.089978] nova-compute[62208]: warnings.warn( [ 1156.094909] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36906741-f002-4cc0-9eb9-22b3177c6ea7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.097316] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.097316] nova-compute[62208]: warnings.warn( [ 1156.106725] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a3b68fc-098d-4ee2-96db-4a782e633fc3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.110693] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.110693] nova-compute[62208]: warnings.warn( [ 1156.115110] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1156.115378] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1156.115626] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Deleting the datastore file [datastore2] 08336643-4254-4447-b7c2-b81054bf9707 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1156.139943] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3293e8c7-1281-4001-8daa-de8cfbd94b99 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.142496] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5fcc9b0-2a40-4c8c-a1f6-b66e227e5cf7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.145445] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.145445] nova-compute[62208]: warnings.warn( [ 1156.145920] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.145920] nova-compute[62208]: warnings.warn( [ 1156.151232] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cd524c81-5b42-4e3b-8545-741d1908c9d1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.153371] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Waiting for the task: (returnval){ [ 1156.153371] nova-compute[62208]: value = "task-38538" [ 1156.153371] nova-compute[62208]: _type = "Task" [ 1156.153371] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1156.153573] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.153573] nova-compute[62208]: warnings.warn( [ 1156.157014] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.157014] nova-compute[62208]: warnings.warn( [ 1156.162044] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Task: {'id': task-38538, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1156.174639] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1156.227077] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1156.282744] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1156.283001] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1156.657435] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.657435] nova-compute[62208]: warnings.warn( [ 1156.664739] nova-compute[62208]: DEBUG oslo_vmware.api [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Task: {'id': task-38538, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070527} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1156.665023] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1156.665206] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1156.665370] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1156.665540] nova-compute[62208]: INFO nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Took 0.64 seconds to destroy the instance on the hypervisor. 
[ 1156.667609] nova-compute[62208]: DEBUG nova.compute.claims [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Aborting claim: <nova.compute.claims.Claim object at 0x7fb93695d7b0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1156.667812] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1156.668044] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1156.992029] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6baad302-3f83-4edc-b0d0-e6844b4d5408 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.994699] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1156.994699] nova-compute[62208]: warnings.warn( [ 1157.000350] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5118959a-3723-4674-92a7-66bcefbc81c6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.003352] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1157.003352] nova-compute[62208]: warnings.warn( [ 1157.030027] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12635f8d-770a-402f-b568-5d892639a8bc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.032532] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1157.032532] nova-compute[62208]: warnings.warn( [ 1157.038084] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-387120a9-fa6c-4ab4-97a2-b5d227ffd2e3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.041855] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1157.041855] nova-compute[62208]: warnings.warn( [ 1157.051965] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1157.065377] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1157.082831] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.415s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1157.083422] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] Traceback (most recent call last): [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] self.driver.spawn(context, instance, image_meta, [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1157.083422] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] self._fetch_image_if_missing(context, vi) [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] image_cache(vi, tmp_image_ds_loc) [ 1157.083422] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] vm_util.copy_virtual_disk( [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] session._wait_for_task(vmdk_copy_task) [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] return self.wait_for_task(task_ref) [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] return evt.wait() [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] result = hub.switch() [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] return self.greenlet.switch() [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1157.083844] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] self.f(*self.args, **self.kw) [ 1157.084254] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1157.084254] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] raise exceptions.translate_fault(task_info.error) [ 1157.084254] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1157.084254] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] Faults: ['InvalidArgument'] [ 1157.084254] nova-compute[62208]: ERROR nova.compute.manager [instance: 08336643-4254-4447-b7c2-b81054bf9707] [ 1157.084254] nova-compute[62208]: DEBUG nova.compute.utils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1157.085628] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Build of instance 08336643-4254-4447-b7c2-b81054bf9707 was re-scheduled: A specified parameter was not correct: fileType [ 1157.085628] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1157.086034] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1157.086210] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1157.086379] nova-compute[62208]: DEBUG nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1157.086560] nova-compute[62208]: DEBUG nova.network.neutron [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1157.373947] nova-compute[62208]: DEBUG nova.network.neutron [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1157.388073] nova-compute[62208]: INFO nova.compute.manager [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Took 0.30 seconds to deallocate network for instance. [ 1157.486472] nova-compute[62208]: INFO nova.scheduler.client.report [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Deleted allocations for instance 08336643-4254-4447-b7c2-b81054bf9707 [ 1157.508069] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-109b1bb2-63bb-4b9f-8b9c-4da5e56ac9a3 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "08336643-4254-4447-b7c2-b81054bf9707" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 548.090s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1157.509413] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "08336643-4254-4447-b7c2-b81054bf9707" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 351.862s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1157.509851] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Acquiring lock "08336643-4254-4447-b7c2-b81054bf9707-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1157.509851] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "08336643-4254-4447-b7c2-b81054bf9707-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1157.510007] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "08336643-4254-4447-b7c2-b81054bf9707-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1157.512255] nova-compute[62208]: INFO nova.compute.manager [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Terminating instance [ 1157.514405] nova-compute[62208]: DEBUG nova.compute.manager [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1157.514591] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1157.515082] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-57c44eac-bcc8-4f63-9994-a236bb739244 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.517510] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1157.517510] nova-compute[62208]: warnings.warn( [ 1157.525697] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7b1278b-9a60-4fb0-bbd6-9d1daf8bb360 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.537164] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1157.539787] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1157.539787] nova-compute[62208]: warnings.warn( [ 1157.558998] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 08336643-4254-4447-b7c2-b81054bf9707 could not be found. [ 1157.559209] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1157.559390] nova-compute[62208]: INFO nova.compute.manager [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1157.559641] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1157.559871] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1157.559969] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 08336643-4254-4447-b7c2-b81054bf9707] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1157.604502] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1157.618173] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 08336643-4254-4447-b7c2-b81054bf9707] Took 0.06 seconds to deallocate network for instance. 
[ 1157.630540] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1157.631152] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1157.632335] nova-compute[62208]: INFO nova.compute.claims [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1157.732206] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ddb0b9b4-3582-4cdf-a343-0ff9de890198 tempest-ServerActionsTestJSON-25147579 tempest-ServerActionsTestJSON-25147579-project-member] Lock "08336643-4254-4447-b7c2-b81054bf9707" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.223s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1157.963131] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13ebe885-527f-4260-935c-a06b31dd7290 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.965659] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1157.965659] nova-compute[62208]: warnings.warn( [ 1157.971241] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e397200c-8753-45a1-a74d-70ce4123c6ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.974394] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1157.974394] nova-compute[62208]: warnings.warn( [ 1158.003088] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32625bda-53ca-41e5-aeab-ff99e4a38cfb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1158.005414] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1158.005414] nova-compute[62208]: warnings.warn( [ 1158.011168] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af5d8e89-0422-411b-bec6-dd8a3d262a41 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1158.014978] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1158.014978] nova-compute[62208]: warnings.warn( [ 1158.025026] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1158.035463] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1158.052384] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.421s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1158.052825] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1158.094559] nova-compute[62208]: DEBUG nova.compute.utils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1158.095829] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1158.096037] nova-compute[62208]: DEBUG nova.network.neutron [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1158.112248] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1158.190870] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1158.213075] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1158.213373] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1158.213574] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1158.213769] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1158.213935] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1158.214143] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1158.214390] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1158.214565] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1158.214741] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1158.214907] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1158.215078] nova-compute[62208]: DEBUG nova.virt.hardware [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1158.215976] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c93f39fb-d2dc-4fce-b3dd-ddfa14c27bc5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1158.218484] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1158.218484] nova-compute[62208]: warnings.warn( [ 1158.224687] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d73bbbd-87bc-4e31-94d7-23fefe61d0cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1158.228710] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1158.228710] nova-compute[62208]: warnings.warn( [ 1158.327233] nova-compute[62208]: DEBUG nova.policy [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bd5497a94524d2d97f74b1fbaedd7f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9b50d4a3e0c43d491d13e85d9a2bb8a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1158.680121] nova-compute[62208]: DEBUG nova.network.neutron [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Successfully created port: a27e1527-9130-49a5-be27-029291331c78 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1159.431849] nova-compute[62208]: DEBUG nova.compute.manager [req-e1a87606-027b-456b-931c-7a5950339824 req-e62ebb9d-1cbc-421e-a6e5-b6a126ee5c54 service nova] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Received event network-vif-plugged-a27e1527-9130-49a5-be27-029291331c78 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1159.432163] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e1a87606-027b-456b-931c-7a5950339824 req-e62ebb9d-1cbc-421e-a6e5-b6a126ee5c54 service nova] Acquiring lock "7f79eba6-e15c-4402-b46b-028d552a81d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.432316] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e1a87606-027b-456b-931c-7a5950339824 req-e62ebb9d-1cbc-421e-a6e5-b6a126ee5c54 service nova] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1159.432482] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e1a87606-027b-456b-931c-7a5950339824 req-e62ebb9d-1cbc-421e-a6e5-b6a126ee5c54 service nova] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1159.432647] nova-compute[62208]: DEBUG nova.compute.manager [req-e1a87606-027b-456b-931c-7a5950339824 req-e62ebb9d-1cbc-421e-a6e5-b6a126ee5c54 service nova] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] No waiting events found dispatching network-vif-plugged-a27e1527-9130-49a5-be27-029291331c78 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1159.432904] nova-compute[62208]: WARNING nova.compute.manager [req-e1a87606-027b-456b-931c-7a5950339824 req-e62ebb9d-1cbc-421e-a6e5-b6a126ee5c54 service nova] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Received unexpected event 
network-vif-plugged-a27e1527-9130-49a5-be27-029291331c78 for instance with vm_state building and task_state spawning. [ 1159.510985] nova-compute[62208]: DEBUG nova.network.neutron [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Successfully updated port: a27e1527-9130-49a5-be27-029291331c78 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1159.522273] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "refresh_cache-7f79eba6-e15c-4402-b46b-028d552a81d4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1159.522414] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "refresh_cache-7f79eba6-e15c-4402-b46b-028d552a81d4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1159.522567] nova-compute[62208]: DEBUG nova.network.neutron [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1159.563377] nova-compute[62208]: DEBUG nova.network.neutron [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1159.751057] nova-compute[62208]: DEBUG nova.network.neutron [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Updating instance_info_cache with network_info: [{"id": "a27e1527-9130-49a5-be27-029291331c78", "address": "fa:16:3e:f4:73:a8", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa27e1527-91", "ovs_interfaceid": "a27e1527-9130-49a5-be27-029291331c78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1159.766313] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "refresh_cache-7f79eba6-e15c-4402-b46b-028d552a81d4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1159.766647] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Instance network_info: |[{"id": "a27e1527-9130-49a5-be27-029291331c78", "address": "fa:16:3e:f4:73:a8", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa27e1527-91", "ovs_interfaceid": "a27e1527-9130-49a5-be27-029291331c78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 
1159.767399] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f4:73:a8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '459b8c74-0aa6-42b6-996a-42b1c5d7e5c6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a27e1527-9130-49a5-be27-029291331c78', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1159.775184] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating folder: Project (a9b50d4a3e0c43d491d13e85d9a2bb8a). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1159.776088] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-df88c7f1-0e5b-4bbd-9e5a-0bc8bf3a2427 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.778336] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1159.778336] nova-compute[62208]: warnings.warn( [ 1159.788032] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Created folder: Project (a9b50d4a3e0c43d491d13e85d9a2bb8a) in parent group-v17427. [ 1159.788032] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating folder: Instances. Parent ref: group-v17516. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1159.788707] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-25be1a21-6e65-4c98-b8ca-a59e84c04a86 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.790448] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1159.790448] nova-compute[62208]: warnings.warn( [ 1159.799814] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Created folder: Instances in parent group-v17516. 
[ 1159.800465] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1159.800841] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1159.801087] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8e144bbe-8416-41e7-ba7e-bf4994263fec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.817080] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1159.817080] nova-compute[62208]: warnings.warn( [ 1159.823057] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1159.823057] nova-compute[62208]: value = "task-38541" [ 1159.823057] nova-compute[62208]: _type = "Task" [ 1159.823057] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1159.826072] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1159.826072] nova-compute[62208]: warnings.warn( [ 1159.831483] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38541, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1160.327171] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1160.327171] nova-compute[62208]: warnings.warn( [ 1160.333869] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38541, 'name': CreateVM_Task, 'duration_secs': 0.322626} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1160.334037] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1160.334658] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.334890] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1160.337999] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bec07fb-2d8c-4239-bd5b-5a0ff46c9ca5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.350378] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1160.350378] nova-compute[62208]: warnings.warn( [ 1160.373761] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Reconfiguring VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1160.374162] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-1c663ed0-dbc5-4f03-858f-cf09ddb2c732 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.384840] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1160.384840] nova-compute[62208]: warnings.warn( [ 1160.391980] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1160.391980] nova-compute[62208]: value = "task-38542" [ 1160.391980] nova-compute[62208]: _type = "Task" [ 1160.391980] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1160.395246] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1160.395246] nova-compute[62208]: warnings.warn( [ 1160.401819] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38542, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1160.896507] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1160.896507] nova-compute[62208]: warnings.warn( [ 1160.902951] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38542, 'name': ReconfigVM_Task, 'duration_secs': 0.118346} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1160.903298] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Reconfigured VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1160.903516] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.569s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1160.903763] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1160.903903] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1160.904284] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 
tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1160.904537] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-31945a7d-2aaa-4626-a396-84b7745c72b5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.906257] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1160.906257] nova-compute[62208]: warnings.warn( [ 1160.909992] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1160.909992] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527c5304-d248-6fec-7aec-a9d76d5b183d" [ 1160.909992] nova-compute[62208]: _type = "Task" [ 1160.909992] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1160.913260] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1160.913260] nova-compute[62208]: warnings.warn( [ 1160.919603] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527c5304-d248-6fec-7aec-a9d76d5b183d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1161.414522] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1161.414522] nova-compute[62208]: warnings.warn( [ 1161.422040] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1161.422310] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1161.422526] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1161.462024] nova-compute[62208]: DEBUG nova.compute.manager [req-faf61332-6043-499c-8b36-200c3aafc2cd req-7eb3cce6-5571-4610-b18d-d4cd7f0fc674 service nova] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Received event network-changed-a27e1527-9130-49a5-be27-029291331c78 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1161.462024] nova-compute[62208]: DEBUG nova.compute.manager [req-faf61332-6043-499c-8b36-200c3aafc2cd req-7eb3cce6-5571-4610-b18d-d4cd7f0fc674 service nova] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Refreshing instance network info cache due to event network-changed-a27e1527-9130-49a5-be27-029291331c78. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1161.462024] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-faf61332-6043-499c-8b36-200c3aafc2cd req-7eb3cce6-5571-4610-b18d-d4cd7f0fc674 service nova] Acquiring lock "refresh_cache-7f79eba6-e15c-4402-b46b-028d552a81d4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1161.462024] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-faf61332-6043-499c-8b36-200c3aafc2cd req-7eb3cce6-5571-4610-b18d-d4cd7f0fc674 service nova] Acquired lock "refresh_cache-7f79eba6-e15c-4402-b46b-028d552a81d4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1161.462177] nova-compute[62208]: DEBUG nova.network.neutron [req-faf61332-6043-499c-8b36-200c3aafc2cd req-7eb3cce6-5571-4610-b18d-d4cd7f0fc674 service nova] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Refreshing network info cache for port a27e1527-9130-49a5-be27-029291331c78 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1161.736456] nova-compute[62208]: DEBUG nova.network.neutron [req-faf61332-6043-499c-8b36-200c3aafc2cd req-7eb3cce6-5571-4610-b18d-d4cd7f0fc674 service nova] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Updated VIF entry in instance network info cache for port a27e1527-9130-49a5-be27-029291331c78. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1161.736988] nova-compute[62208]: DEBUG nova.network.neutron [req-faf61332-6043-499c-8b36-200c3aafc2cd req-7eb3cce6-5571-4610-b18d-d4cd7f0fc674 service nova] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Updating instance_info_cache with network_info: [{"id": "a27e1527-9130-49a5-be27-029291331c78", "address": "fa:16:3e:f4:73:a8", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa27e1527-91", "ovs_interfaceid": "a27e1527-9130-49a5-be27-029291331c78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1161.748082] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-faf61332-6043-499c-8b36-200c3aafc2cd req-7eb3cce6-5571-4610-b18d-d4cd7f0fc674 service nova] Releasing lock "refresh_cache-7f79eba6-e15c-4402-b46b-028d552a81d4" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1162.697918] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "7f79eba6-e15c-4402-b46b-028d552a81d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1168.145420] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1168.145806] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1171.311053] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] 
Acquiring lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1171.311346] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1174.375109] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_power_states {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1174.398148] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Getting list of instances from cluster (obj){ [ 1174.398148] nova-compute[62208]: value = "domain-c8" [ 1174.398148] nova-compute[62208]: _type = "ClusterComputeResource" [ 1174.398148] nova-compute[62208]: } {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1174.399471] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c784325b-acf1-4a45-b14e-7f2c33851795 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.403372] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1174.403372] nova-compute[62208]: warnings.warn( [ 1174.419409] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Got total of 10 instances {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1174.419601] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid af8885cb-afba-4724-be10-083e16f8bfc4 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.419800] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.419959] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.420165] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.420324] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.420477] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid c4512476-9905-4f33-8575-d0a0f24ed4d5 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.420626] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 17dbfb9d-4ec1-4937-8bb7-343101f8f61b {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.420775] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid d5c7531e-b496-4aed-be05-f1a96391e327 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.420922] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid d1a22d6e-d913-47de-9188-507d2475f745 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.421067] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 7f79eba6-e15c-4402-b46b-028d552a81d4 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1174.421427] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "af8885cb-afba-4724-be10-083e16f8bfc4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.421664] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock 
"14c248a0-9f16-40a5-a8c2-06536fdd8cb7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.421860] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.422053] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.422240] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.422431] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.422623] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.422812] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "d5c7531e-b496-4aed-be05-f1a96391e327" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.423007] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "d1a22d6e-d913-47de-9188-507d2475f745" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.423198] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "7f79eba6-e15c-4402-b46b-028d552a81d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1176.188786] nova-compute[62208]: DEBUG 
oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1180.140803] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1180.141138] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1180.141138] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1180.161014] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.161190] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.161330] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.161457] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.161579] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.161700] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.161820] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.161940] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.162058] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.162175] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1180.162294] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1181.140440] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1181.140674] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1181.140837] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1182.141587] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1182.151940] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1182.152087] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1182.152267] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1182.152443] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1182.153561] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f56de045-6994-4a6b-b6f1-df808ffbf1db {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.156689] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1182.156689] nova-compute[62208]: warnings.warn( [ 1182.162896] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e13b0f7-a15b-4fc1-8deb-bf0fbd6149c6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.166872] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1182.166872] nova-compute[62208]: warnings.warn( [ 1182.178225] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-def908db-2c71-4a20-ae5d-c239ae50d31a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.180931] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1182.180931] nova-compute[62208]: warnings.warn( [ 1182.185443] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c8de913-ec70-4831-952f-25ff4b460c24 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.188534] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1182.188534] nova-compute[62208]: warnings.warn( [ 1182.214617] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181911MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1182.214778] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1182.214980] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1182.288545] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance af8885cb-afba-4724-be10-083e16f8bfc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.288721] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.288865] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.289000] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.289123] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
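The resource tracker entries above and below compare each instance tracked on this host against the allocations it holds in the placement service. To inspect one of those allocation records out of band, one option (assuming the osc-placement CLI plugin is installed; the consumer UUID below is simply one of the instances from this log) is roughly:

    # Show the placement allocations held by one instance (consumer UUID)
    openstack resource provider allocation show 7f79eba6-e15c-4402-b46b-028d552a81d4

The output would be expected to list the same DISK_GB/MEMORY_MB/VCPU resources that _remove_deleted_instances_allocations reports here.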
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.289245] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.289365] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.289487] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.289598] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.289713] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1182.303489] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 6f86dbc6-cfa0-422f-8c80-93c47a7764df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.322031] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.333438] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b21f3c77-3118-4229-96b4-7e242f0d5ad5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.343516] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 38c6aba9-f3ad-475b-b0d2-3feb30073826 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.354580] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 706adf2f-f3a5-4c70-bdaa-a31911d3fda8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.368238] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 228a512d-ee17-440e-88e2-023af55853c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.379268] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3c0deb5b-4eeb-45fb-a248-187734fb8820 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.391647] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5fb62e8e-15de-4e18-a8e9-a5507e29a4bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.402624] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bed3e7fa-24a1-4399-82de-4a128d655376 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.416817] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e01f0362-0594-471f-a5f9-b444fb774606 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.429455] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.440452] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.453666] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1182.453921] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1182.454072] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1182.740468] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-653b0fda-9ac4-4e9d-9304-334c107ba049 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.742915] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
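The "Total usable vcpus: 48, total allocated vcpus: 10" and "Final resource view ... used_ram=1792MB used_disk=10GB used_vcpus=10" figures above follow directly from the ten tracked instances, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, plus the 512 MB memory reservation that also appears as 'reserved': 512 in the MEMORY_MB inventory a few entries further down. A minimal cross-check of that arithmetic:

    # Rough cross-check of the "Final resource view" numbers above, assuming ten
    # tracked instances at {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} each and a
    # 512 MB host memory reservation.
    instances = 10
    per_instance = {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}
    reserved_ram_mb = 512

    used_ram_mb = reserved_ram_mb + instances * per_instance["MEMORY_MB"]   # 1792
    used_disk_gb = instances * per_instance["DISK_GB"]                      # 10
    used_vcpus = instances * per_instance["VCPU"]                           # 10
    print(used_ram_mb, used_disk_gb, used_vcpus)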
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1182.742915] nova-compute[62208]: warnings.warn( [ 1182.747862] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fc3f3de-be4a-4c9f-b110-abaf72704ccd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.751025] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1182.751025] nova-compute[62208]: warnings.warn( [ 1182.777118] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70521f41-5b3f-4126-bdd7-01873e6e99cd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.779575] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1182.779575] nova-compute[62208]: warnings.warn( [ 1182.785091] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49fd83ab-6c9e-4fa2-9425-d0c6e8a4a4a5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.788855] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
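The repeated InsecureRequestWarning lines come from urllib3 because the VMware driver is talking to vCenter over HTTPS without certificate verification. A sketch of one way to make them go away, assuming a CA bundle for vc1.osci.c.eu-de-1.cloud.sap is available (the file path below is only an illustration), is to enable verification in the [vmware] section of nova.conf:

    [vmware]
    insecure = False
    # Hypothetical path; point this at the CA bundle that signed the vCenter certificate.
    ca_file = /etc/nova/vcenter-ca.pem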
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1182.788855] nova-compute[62208]: warnings.warn( [ 1182.798406] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1182.812781] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1182.831872] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1182.831872] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1183.825989] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.826313] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.826403] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.826547] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
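The _reclaim_queued_deletes task above is a no-op because reclaim_instance_interval is 0, as the "CONF.reclaim_instance_interval <= 0, skipping..." message states. For reference, deferred (soft) deletion is switched on by setting a positive interval in the [DEFAULT] section of nova.conf; the value below is only an illustration:

    [DEFAULT]
    # Seconds to keep soft-deleted instances before they are reclaimed; 0 disables soft delete.
    reclaim_instance_interval = 3600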
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1203.136161] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "e45ad927-7d07-43d5-84b8-339c68981de6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1203.136950] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "e45ad927-7d07-43d5-84b8-339c68981de6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1205.511777] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1205.511777] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1205.512397] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1205.513983] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Caching image {{(pid=62208) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1205.514255] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Copying Virtual Disk [datastore2] vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/773867e2-fdca-4901-b16f-3f03aec68b4d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1205.514510] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6382edcb-bf17-4077-985b-2a7e3360d537 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1205.517174] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1205.517174] nova-compute[62208]: warnings.warn( [ 1205.523737] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Waiting for the task: (returnval){ [ 1205.523737] nova-compute[62208]: value = "task-38543" [ 1205.523737] nova-compute[62208]: _type = "Task" [ 1205.523737] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1205.527310] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1205.527310] nova-compute[62208]: warnings.warn( [ 1205.532818] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Task: {'id': task-38543, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1206.027707] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.027707] nova-compute[62208]: warnings.warn( [ 1206.035606] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1206.035910] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1206.036520] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Traceback (most recent call last): [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] yield resources [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] self.driver.spawn(context, instance, image_meta, [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] self._fetch_image_if_missing(context, vi) [ 1206.036520] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] image_cache(vi, tmp_image_ds_loc) [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] vm_util.copy_virtual_disk( [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] 
session._wait_for_task(vmdk_copy_task) [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] return self.wait_for_task(task_ref) [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] return evt.wait() [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] result = hub.switch() [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1206.037000] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] return self.greenlet.switch() [ 1206.037497] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1206.037497] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] self.f(*self.args, **self.kw) [ 1206.037497] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1206.037497] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] raise exceptions.translate_fault(task_info.error) [ 1206.037497] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1206.037497] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Faults: ['InvalidArgument'] [ 1206.037497] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] [ 1206.037497] nova-compute[62208]: INFO nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Terminating instance [ 1206.039599] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Start destroying the instance on the hypervisor. 
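The traceback above ends in oslo_vmware.exceptions.VimFaultException ("A specified parameter was not correct: fileType", faults ['InvalidArgument']) raised while polling CopyVirtualDisk_Task; the earlier "Fault InvalidArgument not matched" entry shows that no more specific exception class was registered for that fault, so the generic VimFaultException is what callers see. A minimal sketch of how calling code might distinguish this case, assuming the fault_list attribute carries the fault names exactly as printed in the traceback:

    from oslo_vmware import exceptions as vexc

    def copy_disk_or_report(session, vmdk_copy_task):
        # Hypothetical wrapper: wait for the copy task and surface the vCenter fault names.
        try:
            session.wait_for_task(vmdk_copy_task)
        except vexc.VimFaultException as exc:
            # fault_list holds the fault names from vCenter, e.g. ['InvalidArgument'].
            if 'InvalidArgument' in (exc.fault_list or []):
                raise RuntimeError(
                    "CopyVirtualDisk_Task rejected its spec (fileType): %s" % exc) from exc
            raise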
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1206.039870] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1206.040211] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1206.040447] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1206.041642] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25195253-02e2-480a-b3c7-2ed7e5bbbc05 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.044753] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-87524503-0f71-4b3a-9956-e96174493a6b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.046469] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.046469] nova-compute[62208]: warnings.warn( [ 1206.046857] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.046857] nova-compute[62208]: warnings.warn( [ 1206.051919] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1206.052223] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-43556401-ae88-4a47-ac5b-a7f20a89871d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.054631] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1206.054806] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1206.055383] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.055383] nova-compute[62208]: warnings.warn( [ 1206.055824] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2548a55e-02da-4c84-a3a0-a092d6fb880d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.057854] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.057854] nova-compute[62208]: warnings.warn( [ 1206.060946] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Waiting for the task: (returnval){ [ 1206.060946] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5203befc-3ea5-1e51-0263-6a9af6c1c89f" [ 1206.060946] nova-compute[62208]: _type = "Task" [ 1206.060946] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1206.063918] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.063918] nova-compute[62208]: warnings.warn( [ 1206.068776] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5203befc-3ea5-1e51-0263-6a9af6c1c89f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1206.132615] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1206.132847] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1206.133028] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Deleting the datastore file [datastore2] af8885cb-afba-4724-be10-083e16f8bfc4 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1206.133310] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-534f9aa0-d220-41ef-a2c8-96f3f469d2df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.135224] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.135224] nova-compute[62208]: warnings.warn( [ 1206.141574] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Waiting for the task: (returnval){ [ 1206.141574] nova-compute[62208]: value = "task-38545" [ 1206.141574] nova-compute[62208]: _type = "Task" [ 1206.141574] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1206.144883] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.144883] nova-compute[62208]: warnings.warn( [ 1206.150536] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Task: {'id': task-38545, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1206.565587] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.565587] nova-compute[62208]: warnings.warn( [ 1206.572100] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1206.572456] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Creating directory with path [datastore2] vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1206.572796] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a244dc2a-c30e-4c05-a5ef-c2df297613c0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.574929] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.574929] nova-compute[62208]: warnings.warn( [ 1206.587274] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Created directory with path [datastore2] vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1206.587677] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Fetch image to [datastore2] vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1206.588071] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1206.588931] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eae2f4e-3e21-4fc7-88cb-1e634db56032 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.591531] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.591531] nova-compute[62208]: warnings.warn( [ 1206.596896] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d37ef97d-5c27-4b9e-ae7e-5c2fe5351cae {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.599355] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.599355] nova-compute[62208]: warnings.warn( [ 1206.606841] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea52d98a-6121-4dfe-8ed9-0a99319f08dd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.610532] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.610532] nova-compute[62208]: warnings.warn( [ 1206.642152] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d98b23a5-c43c-4b17-b745-ef2f77ab57c8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.648669] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.648669] nova-compute[62208]: warnings.warn( [ 1206.649138] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.649138] nova-compute[62208]: warnings.warn( [ 1206.658528] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5a9d19ce-4342-400e-b03b-2a3c2da16b27 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1206.660396] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Task: {'id': task-38545, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076605} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1206.660645] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1206.660825] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1206.660995] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1206.661168] nova-compute[62208]: INFO nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1206.662642] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1206.662642] nova-compute[62208]: warnings.warn( [ 1206.663363] nova-compute[62208]: DEBUG nova.compute.claims [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Aborting claim: <nova.compute.claims.Claim object at 0x7fb93711e950> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1206.663544] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1206.663771] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1206.682535] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1206.801974] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1206.860446] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1206.860625] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1207.070175] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78aa4b2e-3cf9-4bfb-86b7-7e371fb7ff70 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.073324] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1207.073324] nova-compute[62208]: warnings.warn( [ 1207.080143] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52566a07-94c9-48d4-b06d-18651cbd8f0e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.083136] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1207.083136] nova-compute[62208]: warnings.warn( [ 1207.113266] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62377e00-6fe9-4cd0-b4df-3c6ead927125 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.116453] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1207.116453] nova-compute[62208]: warnings.warn( [ 1207.122104] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a47fb7e7-d83d-4ba2-bab8-1c2131d4b047 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.126224] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1207.126224] nova-compute[62208]: warnings.warn( [ 1207.137840] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1207.146759] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1207.166234] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.502s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1207.166894] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Traceback (most recent call last): [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] self.driver.spawn(context, instance, image_meta, [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] self._fetch_image_if_missing(context, vi) [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] image_cache(vi, tmp_image_ds_loc) [ 1207.166894] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] vm_util.copy_virtual_disk( [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] session._wait_for_task(vmdk_copy_task) [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] return self.wait_for_task(task_ref) [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] return evt.wait() [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] result = hub.switch() [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] return self.greenlet.switch() [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1207.167278] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] self.f(*self.args, **self.kw) [ 1207.167665] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1207.167665] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] raise exceptions.translate_fault(task_info.error) [ 1207.167665] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1207.167665] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Faults: ['InvalidArgument'] [ 1207.167665] nova-compute[62208]: ERROR nova.compute.manager [instance: af8885cb-afba-4724-be10-083e16f8bfc4] [ 1207.167665] nova-compute[62208]: DEBUG nova.compute.utils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 
tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1207.169994] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Build of instance af8885cb-afba-4724-be10-083e16f8bfc4 was re-scheduled: A specified parameter was not correct: fileType [ 1207.169994] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1207.170379] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1207.170558] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1207.170734] nova-compute[62208]: DEBUG nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1207.170921] nova-compute[62208]: DEBUG nova.network.neutron [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1207.502744] nova-compute[62208]: DEBUG nova.network.neutron [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1207.522191] nova-compute[62208]: INFO nova.compute.manager [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Took 0.35 seconds to deallocate network for instance. 
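
The build failure above surfaces through the generic task-wait path shown in the traceback: copy_virtual_disk hands the CopyVirtualDisk task to session._wait_for_task, oslo_vmware.api polls the task (the "progress is 0%" and "completed successfully ... duration_secs" DEBUG lines come from _poll_task), and a fault raised by the task is translated and re-raised, here as VimFaultException "A specified parameter was not correct: fileType". The following is only a minimal, self-contained sketch of that poll-until-terminal pattern, not oslo.vmware's implementation; wait_for_task, TaskFailed and the 0.5 s interval are names/values assumed for illustration.

import time

class TaskFailed(Exception):
    """Raised when the remote task ends in an error state (illustrative only)."""

def wait_for_task(poll_task, interval=0.5, log=print):
    """Poll a long-running task until it reaches a terminal state.

    poll_task() is assumed to return an object with .state in
    {'queued', 'running', 'success', 'error'} plus .progress, .result
    and .error, loosely mirroring the vCenter TaskInfo fields seen above.
    """
    while True:
        info = poll_task()
        if info.state in ('queued', 'running'):
            log("Task progress is %s%%" % info.progress)   # cf. the _poll_task DEBUG lines
            time.sleep(interval)
            continue
        if info.state == 'success':
            log("Task completed successfully")             # cf. 'completed successfully'
            return info.result
        # Terminal error state: translate and raise, as _poll_task does with translate_fault().
        raise TaskFailed(info.error)
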
[ 1207.647606] nova-compute[62208]: INFO nova.scheduler.client.report [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Deleted allocations for instance af8885cb-afba-4724-be10-083e16f8bfc4 [ 1207.673861] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b9ee6d24-9a40-47f1-b39f-8fc795e35ed1 tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "af8885cb-afba-4724-be10-083e16f8bfc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 597.425s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.675769] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "af8885cb-afba-4724-be10-083e16f8bfc4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 397.571s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1207.676069] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Acquiring lock "af8885cb-afba-4724-be10-083e16f8bfc4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1207.676309] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "af8885cb-afba-4724-be10-083e16f8bfc4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1207.676493] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "af8885cb-afba-4724-be10-083e16f8bfc4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.678668] nova-compute[62208]: INFO nova.compute.manager [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Terminating instance [ 1207.680654] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1207.680865] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1207.681410] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-09a7d902-ead0-40bf-a727-b40d41a0d12d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.683496] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1207.683496] nova-compute[62208]: warnings.warn( [ 1207.691349] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4df54c1-3800-46a6-a53d-7c06e4d2e0b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1207.704234] nova-compute[62208]: DEBUG nova.compute.manager [None req-20ff9ed8-8b8b-4967-a5d9-1b0e1faf3a30 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 79777c07-535b-43ca-9d49-7d595da14adc] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1207.706566] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1207.706566] nova-compute[62208]: warnings.warn( [ 1207.727507] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance af8885cb-afba-4724-be10-083e16f8bfc4 could not be found. [ 1207.727757] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1207.728137] nova-compute[62208]: INFO nova.compute.manager [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Took 0.05 seconds to destroy the instance on the hypervisor. 
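
The oslo_concurrency.lockutils lines throughout this log follow one pattern: log the attempt ('Acquiring lock "X" by "Y"'), log how long the acquire waited (e.g. "waited 397.571s" above), run the critical section, then log how long the lock was held ("held 597.425s"). A simplified context-manager version of that pattern, assuming plain threading.Lock rather than oslo.concurrency's fair or external locks; timed_lock and the module-level registry are names invented for this sketch.

import threading
import time
from contextlib import contextmanager

_locks = {}  # name -> threading.Lock, a simplified stand-in for lockutils' registry

@contextmanager
def timed_lock(name, owner, log=print):
    """Acquire a named lock and log waited/held durations, mimicking the
    'acquired :: waited' / '"released" :: held' lines in the log (simplified)."""
    lock = _locks.setdefault(name, threading.Lock())
    log('Acquiring lock "%s" by "%s"' % (name, owner))
    t0 = time.monotonic()
    lock.acquire()
    t1 = time.monotonic()
    log('Lock "%s" acquired by "%s" :: waited %.3fs' % (name, owner, t1 - t0))
    try:
        yield
    finally:
        lock.release()
        log('Lock "%s" "released" by "%s" :: held %.3fs'
            % (name, owner, time.monotonic() - t1))

# Example: with timed_lock("compute_resources", "instance_claim"): claim_resources()
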
[ 1207.728428] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1207.728809] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1207.729076] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1207.733414] nova-compute[62208]: DEBUG nova.compute.manager [None req-20ff9ed8-8b8b-4967-a5d9-1b0e1faf3a30 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 79777c07-535b-43ca-9d49-7d595da14adc] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1207.756263] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1207.765916] nova-compute[62208]: INFO nova.compute.manager [-] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] Took 0.04 seconds to deallocate network for instance. [ 1207.771780] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-20ff9ed8-8b8b-4967-a5d9-1b0e1faf3a30 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "79777c07-535b-43ca-9d49-7d595da14adc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 236.919s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.787418] nova-compute[62208]: DEBUG nova.compute.manager [None req-76e638d2-c0a1-41d2-a003-4ee13e404c94 tempest-ImagesNegativeTestJSON-1148079054 tempest-ImagesNegativeTestJSON-1148079054-project-member] [instance: 6f86dbc6-cfa0-422f-8c80-93c47a7764df] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1207.823569] nova-compute[62208]: DEBUG nova.compute.manager [None req-76e638d2-c0a1-41d2-a003-4ee13e404c94 tempest-ImagesNegativeTestJSON-1148079054 tempest-ImagesNegativeTestJSON-1148079054-project-member] [instance: 6f86dbc6-cfa0-422f-8c80-93c47a7764df] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1207.856496] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-76e638d2-c0a1-41d2-a003-4ee13e404c94 tempest-ImagesNegativeTestJSON-1148079054 tempest-ImagesNegativeTestJSON-1148079054-project-member] Lock "6f86dbc6-cfa0-422f-8c80-93c47a7764df" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 206.452s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.881652] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1207.889300] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ed1b3df-2aa7-4a98-ae97-128791dd805c tempest-ServersAdminTestJSON-1987698982 tempest-ServersAdminTestJSON-1987698982-project-member] Lock "af8885cb-afba-4724-be10-083e16f8bfc4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.214s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.890317] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "af8885cb-afba-4724-be10-083e16f8bfc4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 33.469s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1207.890566] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: af8885cb-afba-4724-be10-083e16f8bfc4] During sync_power_state the instance has a pending task (deleting). Skip. 
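
The inventory payload repeated in this log for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 (VCPU total 48 with allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, DISK_GB total 400) is what the scheduler reports to Placement unchanged ("Inventory has not changed"). Placement conventionally treats usable capacity per resource class as (total - reserved) * allocation_ratio; a short worked check of the numbers above, with the helper name chosen only for this sketch:

def usable_capacity(inv):
    """Schedulable capacity per resource class: (total - reserved) * allocation_ratio."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

print(usable_capacity(inventory))
# {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
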
[ 1207.890783] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "af8885cb-afba-4724-be10-083e16f8bfc4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1207.944471] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1207.944724] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1207.946140] nova-compute[62208]: INFO nova.compute.claims [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1208.285658] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0429fdb-277b-4d1e-b061-e448c16ac61b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.288544] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1208.288544] nova-compute[62208]: warnings.warn( [ 1208.293865] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58d15bef-1497-47ff-8bea-c31cf7643973 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.296888] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1208.296888] nova-compute[62208]: warnings.warn( [ 1208.324282] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd2217d2-0168-4d60-9501-cf924af9d123 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.326709] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1208.326709] nova-compute[62208]: warnings.warn( [ 1208.332347] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3593b8b4-c010-4162-b327-a787b72010ab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.336374] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1208.336374] nova-compute[62208]: warnings.warn( [ 1208.346088] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1208.355402] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1208.375981] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.429s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1208.375981] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1208.410336] nova-compute[62208]: DEBUG nova.compute.utils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1208.415951] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1208.416206] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1208.422477] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1208.500264] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1208.503434] nova-compute[62208]: DEBUG nova.policy [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2df841e548dc40349fa4f0d8e5dffd85', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7de608ed8dbd42b29b2a1da85885ed92', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1208.523997] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1208.524260] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1208.524419] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Image 
limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1208.524601] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1208.524750] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1208.524929] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1208.525145] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1208.525332] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1208.525509] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1208.525672] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1208.525847] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1208.526718] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58edcf5e-72eb-4229-8ed3-2ecd84f67404 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.529206] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1208.529206] nova-compute[62208]: warnings.warn( [ 1208.535170] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f9e9fe0-a302-4915-88b6-a0a301b09259 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.538992] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1208.538992] nova-compute[62208]: warnings.warn( [ 1208.893215] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Successfully created port: 1aeb195c-f802-4345-8127-9f56fc1397e5 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1208.906037] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1209.265005] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Successfully created port: eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1209.645036] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Successfully created port: 35098085-927f-4513-9778-197227df8aea {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1210.600837] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Successfully updated port: 1aeb195c-f802-4345-8127-9f56fc1397e5 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1210.761529] nova-compute[62208]: DEBUG nova.compute.manager [req-dfb890e0-0c8b-4073-9c47-b857e22a5e57 req-1408e988-c998-4290-b2db-7730f1ee0b20 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received event network-vif-plugged-1aeb195c-f802-4345-8127-9f56fc1397e5 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1210.761757] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dfb890e0-0c8b-4073-9c47-b857e22a5e57 req-1408e988-c998-4290-b2db-7730f1ee0b20 service nova] Acquiring lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.761966] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dfb890e0-0c8b-4073-9c47-b857e22a5e57 req-1408e988-c998-4290-b2db-7730f1ee0b20 service nova] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1210.762161] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dfb890e0-0c8b-4073-9c47-b857e22a5e57 req-1408e988-c998-4290-b2db-7730f1ee0b20 service nova] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1210.762339] nova-compute[62208]: DEBUG nova.compute.manager [req-dfb890e0-0c8b-4073-9c47-b857e22a5e57 req-1408e988-c998-4290-b2db-7730f1ee0b20 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] No waiting events found dispatching network-vif-plugged-1aeb195c-f802-4345-8127-9f56fc1397e5 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1210.762504] nova-compute[62208]: WARNING nova.compute.manager [req-dfb890e0-0c8b-4073-9c47-b857e22a5e57 req-1408e988-c998-4290-b2db-7730f1ee0b20 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received unexpected event network-vif-plugged-1aeb195c-f802-4345-8127-9f56fc1397e5 for instance with vm_state building and task_state deleting. [ 1211.401194] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Successfully updated port: eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1212.344253] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Successfully updated port: 35098085-927f-4513-9778-197227df8aea {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1212.356845] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1212.357040] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquired lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1212.357196] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Building network info cache for instance {{(pid=62208) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2009}} [ 1212.401219] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1212.941569] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "f000e638-100f-4a53-853d-4a94ffe71bed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1212.941825] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "f000e638-100f-4a53-853d-4a94ffe71bed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1212.976765] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received event network-changed-1aeb195c-f802-4345-8127-9f56fc1397e5 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1212.977130] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Refreshing instance network info cache due to event network-changed-1aeb195c-f802-4345-8127-9f56fc1397e5. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1212.977356] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Acquiring lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1213.164865] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updating instance_info_cache with network_info: [{"id": "1aeb195c-f802-4345-8127-9f56fc1397e5", "address": "fa:16:3e:ec:a0:41", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1aeb195c-f8", "ovs_interfaceid": "1aeb195c-f802-4345-8127-9f56fc1397e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "address": "fa:16:3e:be:a2:c1", "network": {"id": "0168bef2-b016-4af4-be08-b9b96acd6970", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-881350884", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "419a5b3f-4c6f-4168-9def-746b4d8c5c24", "external-id": "nsx-vlan-transportzone-656", "segmentation_id": 656, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeb5dbe91-84", "ovs_interfaceid": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "35098085-927f-4513-9778-197227df8aea", "address": "fa:16:3e:ad:db:20", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": 
"7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35098085-92", "ovs_interfaceid": "35098085-927f-4513-9778-197227df8aea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1213.180216] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Releasing lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1213.180771] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance network_info: |[{"id": "1aeb195c-f802-4345-8127-9f56fc1397e5", "address": "fa:16:3e:ec:a0:41", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1aeb195c-f8", "ovs_interfaceid": "1aeb195c-f802-4345-8127-9f56fc1397e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "address": "fa:16:3e:be:a2:c1", "network": {"id": "0168bef2-b016-4af4-be08-b9b96acd6970", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-881350884", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "419a5b3f-4c6f-4168-9def-746b4d8c5c24", "external-id": "nsx-vlan-transportzone-656", "segmentation_id": 656, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeb5dbe91-84", "ovs_interfaceid": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}, {"id": "35098085-927f-4513-9778-197227df8aea", "address": "fa:16:3e:ad:db:20", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35098085-92", "ovs_interfaceid": "35098085-927f-4513-9778-197227df8aea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1213.181113] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Acquired lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1213.181292] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Refreshing network info cache for port 1aeb195c-f802-4345-8127-9f56fc1397e5 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1213.182999] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ec:a0:41', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6046aec4-feda-4ef9-bf4a-800de1e0cd3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1aeb195c-f802-4345-8127-9f56fc1397e5', 'vif_model': 'e1000'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:be:a2:c1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '419a5b3f-4c6f-4168-9def-746b4d8c5c24', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1', 'vif_model': 'e1000'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:db:20', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6046aec4-feda-4ef9-bf4a-800de1e0cd3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '35098085-927f-4513-9778-197227df8aea', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1213.194987] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Creating folder: Project (7de608ed8dbd42b29b2a1da85885ed92). Parent ref: group-v17427. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1213.205702] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1b9f121c-7d8e-410d-96a0-a5de1dae6224 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.208252] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1213.208252] nova-compute[62208]: warnings.warn( [ 1213.218736] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Created folder: Project (7de608ed8dbd42b29b2a1da85885ed92) in parent group-v17427. [ 1213.218932] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Creating folder: Instances. Parent ref: group-v17520. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1213.219168] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c6f74cb4-bfd1-4137-9d77-0fbca53c257e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.220798] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1213.220798] nova-compute[62208]: warnings.warn( [ 1213.229603] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Created folder: Instances in parent group-v17520. [ 1213.229854] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1213.230047] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1213.230253] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4084b45a-f550-4096-bac3-6eee098fd32a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.255422] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1213.255422] nova-compute[62208]: warnings.warn( [ 1213.261953] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1213.261953] nova-compute[62208]: value = "task-38548" [ 1213.261953] nova-compute[62208]: _type = "Task" [ 1213.261953] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1213.266385] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1213.266385] nova-compute[62208]: warnings.warn( [ 1213.273086] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38548, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1213.766874] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1213.766874] nova-compute[62208]: warnings.warn( [ 1213.772608] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38548, 'name': CreateVM_Task, 'duration_secs': 0.404269} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1213.772801] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1213.787079] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1213.787326] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1213.792375] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-206ec897-a58c-4fa6-9196-652e0f536436 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.801851] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updated VIF entry in instance network info cache for port 1aeb195c-f802-4345-8127-9f56fc1397e5. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1213.802626] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updating instance_info_cache with network_info: [{"id": "1aeb195c-f802-4345-8127-9f56fc1397e5", "address": "fa:16:3e:ec:a0:41", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1aeb195c-f8", "ovs_interfaceid": "1aeb195c-f802-4345-8127-9f56fc1397e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "address": "fa:16:3e:be:a2:c1", "network": {"id": "0168bef2-b016-4af4-be08-b9b96acd6970", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-881350884", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "419a5b3f-4c6f-4168-9def-746b4d8c5c24", "external-id": "nsx-vlan-transportzone-656", "segmentation_id": 656, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeb5dbe91-84", "ovs_interfaceid": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "35098085-927f-4513-9778-197227df8aea", "address": "fa:16:3e:ad:db:20", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35098085-92", "ovs_interfaceid": 
"35098085-927f-4513-9778-197227df8aea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1213.803910] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1213.803910] nova-compute[62208]: warnings.warn( [ 1213.813456] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Releasing lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1213.813733] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received event network-vif-plugged-eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1213.813929] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Acquiring lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1213.814124] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1213.814279] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.814437] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] No waiting events found dispatching network-vif-plugged-eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1213.814599] nova-compute[62208]: WARNING nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received unexpected event network-vif-plugged-eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1 for instance with vm_state building and task_state deleting. 
[ 1213.814764] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received event network-changed-eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1213.814920] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Refreshing instance network info cache due to event network-changed-eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1213.815111] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Acquiring lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1213.815222] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Acquired lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1213.815370] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Refreshing network info cache for port eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1213.829759] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1213.830617] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-66720341-87a9-488f-b0cd-c4b2517ebf9e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.844510] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1213.844510] nova-compute[62208]: warnings.warn( [ 1213.852590] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1213.852590] nova-compute[62208]: value = "task-38549" [ 1213.852590] nova-compute[62208]: _type = "Task" [ 1213.852590] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1213.854409] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1213.854409] nova-compute[62208]: warnings.warn( [ 1213.860236] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38549, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1214.293384] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updated VIF entry in instance network info cache for port eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1214.294071] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updating instance_info_cache with network_info: [{"id": "1aeb195c-f802-4345-8127-9f56fc1397e5", "address": "fa:16:3e:ec:a0:41", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1aeb195c-f8", "ovs_interfaceid": "1aeb195c-f802-4345-8127-9f56fc1397e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "address": "fa:16:3e:be:a2:c1", "network": {"id": "0168bef2-b016-4af4-be08-b9b96acd6970", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-881350884", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "419a5b3f-4c6f-4168-9def-746b4d8c5c24", "external-id": "nsx-vlan-transportzone-656", "segmentation_id": 656, "bound_drivers": {"0": "nsxv3"}}, "devname": 
"tapeb5dbe91-84", "ovs_interfaceid": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "35098085-927f-4513-9778-197227df8aea", "address": "fa:16:3e:ad:db:20", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35098085-92", "ovs_interfaceid": "35098085-927f-4513-9778-197227df8aea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1214.305621] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Releasing lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1214.306205] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received event network-vif-plugged-35098085-927f-4513-9778-197227df8aea {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1214.306205] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Acquiring lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1214.306534] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1214.306534] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1214.306592] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 
req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] No waiting events found dispatching network-vif-plugged-35098085-927f-4513-9778-197227df8aea {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1214.306732] nova-compute[62208]: WARNING nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received unexpected event network-vif-plugged-35098085-927f-4513-9778-197227df8aea for instance with vm_state building and task_state deleting. [ 1214.306896] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Received event network-changed-35098085-927f-4513-9778-197227df8aea {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1214.307047] nova-compute[62208]: DEBUG nova.compute.manager [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Refreshing instance network info cache due to event network-changed-35098085-927f-4513-9778-197227df8aea. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1214.308698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Acquiring lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1214.308698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Acquired lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1214.308698] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Refreshing network info cache for port 35098085-927f-4513-9778-197227df8aea {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1214.357002] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1214.357002] nova-compute[62208]: warnings.warn( [ 1214.362857] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38549, 'name': ReconfigVM_Task, 'duration_secs': 0.123902} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1214.363143] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1214.363354] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.576s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1214.363607] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1214.363809] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1214.364161] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1214.364420] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-da1dfacb-a483-4088-ad5a-520fdf4e4f79 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1214.365974] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1214.365974] nova-compute[62208]: warnings.warn( [ 1214.369486] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1214.369486] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52367652-ebc8-cbe4-0a62-746ab0e81d74" [ 1214.369486] nova-compute[62208]: _type = "Task" [ 1214.369486] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1214.372827] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1214.372827] nova-compute[62208]: warnings.warn( [ 1214.377801] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52367652-ebc8-cbe4-0a62-746ab0e81d74, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1214.876192] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1214.876192] nova-compute[62208]: warnings.warn( [ 1214.883016] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1214.883575] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1214.883853] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1215.148370] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1215.148691] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1215.166267] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updated VIF entry in instance network info cache for port 35098085-927f-4513-9778-197227df8aea. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1215.166743] nova-compute[62208]: DEBUG nova.network.neutron [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updating instance_info_cache with network_info: [{"id": "1aeb195c-f802-4345-8127-9f56fc1397e5", "address": "fa:16:3e:ec:a0:41", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1aeb195c-f8", "ovs_interfaceid": "1aeb195c-f802-4345-8127-9f56fc1397e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "address": "fa:16:3e:be:a2:c1", "network": {"id": "0168bef2-b016-4af4-be08-b9b96acd6970", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-881350884", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "419a5b3f-4c6f-4168-9def-746b4d8c5c24", "external-id": "nsx-vlan-transportzone-656", "segmentation_id": 656, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeb5dbe91-84", "ovs_interfaceid": "eb5dbe91-8472-4bde-92e0-1d0b4b8fb0e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "35098085-927f-4513-9778-197227df8aea", "address": "fa:16:3e:ad:db:20", "network": {"id": "526b2a08-d0b3-4e0d-b0a0-b0238cf835d6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-715164930", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": 
"7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35098085-92", "ovs_interfaceid": "35098085-927f-4513-9778-197227df8aea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1215.177051] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9830f4e1-aed4-43fe-ad18-fc30e0465450 req-d71f6f31-13fa-4595-96e9-e12264af79a7 service nova] Releasing lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1218.293797] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1218.294035] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1220.032250] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2f1ef51f-3ee9-45d8-9653-3c39d9d37dd5 tempest-ServerPasswordTestJSON-653179953 tempest-ServerPasswordTestJSON-653179953-project-member] Acquiring lock "795665a3-58eb-4d8a-bedc-84e399e11bb7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1220.032904] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2f1ef51f-3ee9-45d8-9653-3c39d9d37dd5 tempest-ServerPasswordTestJSON-653179953 tempest-ServerPasswordTestJSON-653179953-project-member] Lock "795665a3-58eb-4d8a-bedc-84e399e11bb7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1222.132861] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8e075d5e-0318-4727-bff8-d9a0320a1e9f tempest-ServerAddressesTestJSON-433221683 tempest-ServerAddressesTestJSON-433221683-project-member] Acquiring lock "a72bbc63-b475-4be7-a412-f0f893a094f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1222.133223] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8e075d5e-0318-4727-bff8-d9a0320a1e9f 
tempest-ServerAddressesTestJSON-433221683 tempest-ServerAddressesTestJSON-433221683-project-member] Lock "a72bbc63-b475-4be7-a412-f0f893a094f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1237.143714] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1241.141287] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1241.141552] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1241.141588] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1241.161995] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.162160] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.162291] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.162416] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.162537] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.162658] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.162778] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.162896] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.163012] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.163126] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1241.163244] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1241.163887] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1241.163994] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1242.140602] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1242.150486] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1242.150760] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1242.150899] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1242.151039] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1242.152181] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e254bda-4cf0-4fb2-ae77-7ea4681dffb4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.155022] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1242.155022] nova-compute[62208]: warnings.warn( [ 1242.161139] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bce0322-bec1-450c-a051-69829594a9da {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.164755] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1242.164755] nova-compute[62208]: warnings.warn( [ 1242.177123] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae90d230-bc2b-4940-99e8-289ed2209088 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.179973] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1242.179973] nova-compute[62208]: warnings.warn( [ 1242.184720] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14683f58-9d01-4c72-b59d-941e8a0be90f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.187705] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1242.187705] nova-compute[62208]: warnings.warn( [ 1242.216695] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181988MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1242.216904] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1242.217059] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1242.283291] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.283464] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.283594] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.283716] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.283838] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.283966] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.284093] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.284213] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.284329] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.284443] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1242.295002] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.305865] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.316098] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.327455] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.337244] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.346893] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.356182] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.365167] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 795665a3-58eb-4d8a-bedc-84e399e11bb7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.373898] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance a72bbc63-b475-4be7-a412-f0f893a094f4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1242.374197] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1242.374347] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1242.598657] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfe0777e-df44-4955-bab7-3dcbdbca5cf6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.601135] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1242.601135] nova-compute[62208]: warnings.warn( [ 1242.607194] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ed5e898-8b78-412a-8d5c-00b25946fef5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.610101] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1242.610101] nova-compute[62208]: warnings.warn( [ 1242.636393] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a7c670-a87a-4e59-9742-ee3040febd57 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.638849] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1242.638849] nova-compute[62208]: warnings.warn( [ 1242.644258] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ba15f8-fd6a-4199-951c-2c1863dbab91 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.647864] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1242.647864] nova-compute[62208]: warnings.warn( [ 1242.657066] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1242.665366] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1242.682514] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1242.682702] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.466s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1243.683656] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1243.683893] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1244.136568] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.136397] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.157714] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.157872] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1255.430778] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1255.430778] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1255.431407] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1255.433230] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1255.433497] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Copying Virtual Disk [datastore2] vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/bb3d6f9c-41ae-4561-8814-0664178b815d/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1255.433803] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-af87ed5b-8a94-4935-8475-9ef050562f7f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.436065] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1255.436065] nova-compute[62208]: warnings.warn( [ 1255.443044] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Waiting for the task: (returnval){ [ 1255.443044] nova-compute[62208]: value = "task-38550" [ 1255.443044] nova-compute[62208]: _type = "Task" [ 1255.443044] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1255.446829] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1255.446829] nova-compute[62208]: warnings.warn( [ 1255.452000] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Task: {'id': task-38550, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1255.948060] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1255.948060] nova-compute[62208]: warnings.warn( [ 1255.955094] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1255.955389] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1255.955969] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Traceback (most recent call last): [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] yield resources [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] self.driver.spawn(context, instance, image_meta, [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] self._fetch_image_if_missing(context, vi) [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] image_cache(vi, tmp_image_ds_loc) [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] vm_util.copy_virtual_disk( [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] 
session._wait_for_task(vmdk_copy_task) [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] return self.wait_for_task(task_ref) [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] return evt.wait() [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] result = hub.switch() [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] return self.greenlet.switch() [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] self.f(*self.args, **self.kw) [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] raise exceptions.translate_fault(task_info.error) [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Faults: ['InvalidArgument'] [ 1255.955969] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] [ 1255.956768] nova-compute[62208]: INFO nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Terminating instance [ 1255.957875] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1255.958111] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 
tempest-InstanceActionsV221TestJSON-973339105-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1255.958355] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-54052895-68db-4b7a-aef6-958da2c25e78 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.960540] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1255.960729] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1255.961441] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc4914cf-c913-4996-b7bf-8f73c9d494c7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.963843] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1255.963843] nova-compute[62208]: warnings.warn( [ 1255.964221] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1255.964221] nova-compute[62208]: warnings.warn( [ 1255.968583] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1255.968839] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ddd0d333-2d21-4d7c-88af-35b87195e978 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.970964] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1255.971129] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1255.971688] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1255.971688] nova-compute[62208]: warnings.warn( [ 1255.972108] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-836319f5-de3a-4901-beca-bd1ed9d9b45e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.974176] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1255.974176] nova-compute[62208]: warnings.warn( [ 1255.976955] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Waiting for the task: (returnval){ [ 1255.976955] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52bb337b-4861-553e-5eca-469dc334c805" [ 1255.976955] nova-compute[62208]: _type = "Task" [ 1255.976955] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1255.979804] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1255.979804] nova-compute[62208]: warnings.warn( [ 1255.984407] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52bb337b-4861-553e-5eca-469dc334c805, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1256.048643] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1256.048895] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1256.049092] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Deleting the datastore file [datastore2] 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1256.049350] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-70638a28-0cfc-480c-b110-94036c556a77 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.051132] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.051132] nova-compute[62208]: warnings.warn( [ 1256.055717] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Waiting for the task: (returnval){ [ 1256.055717] nova-compute[62208]: value = "task-38552" [ 1256.055717] nova-compute[62208]: _type = "Task" [ 1256.055717] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1256.058845] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.058845] nova-compute[62208]: warnings.warn( [ 1256.063581] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Task: {'id': task-38552, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1256.481810] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.481810] nova-compute[62208]: warnings.warn( [ 1256.488159] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1256.488440] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Creating directory with path [datastore2] vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1256.488669] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-38175051-dfe2-4eb1-ab75-65beca7244fb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.490601] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.490601] nova-compute[62208]: warnings.warn( [ 1256.500534] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Created directory with path [datastore2] vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1256.500734] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Fetch image to [datastore2] vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1256.500905] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1256.501651] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06690410-877c-4426-b4e0-d9837c05abe9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.503936] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.503936] nova-compute[62208]: warnings.warn( [ 1256.509303] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04689e47-e0d7-43db-8a9c-c61c0e74ca43 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.511466] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.511466] nova-compute[62208]: warnings.warn( [ 1256.518273] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94ad0d57-aa3d-4f11-bc7d-4c942198cbf1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.521706] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.521706] nova-compute[62208]: warnings.warn( [ 1256.551412] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-289ed5d1-6abd-4eb5-8837-8d8c316eebe6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.553725] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.553725] nova-compute[62208]: warnings.warn( [ 1256.560041] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-235deba4-4b11-4bc9-8105-b0b174c91a17 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.561627] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.561627] nova-compute[62208]: warnings.warn( [ 1256.561930] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.561930] nova-compute[62208]: warnings.warn( [ 1256.566529] nova-compute[62208]: DEBUG oslo_vmware.api [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Task: {'id': task-38552, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066878} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1256.566807] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1256.566997] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1256.567171] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1256.567360] nova-compute[62208]: INFO nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1256.569598] nova-compute[62208]: DEBUG nova.compute.claims [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936954340> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1256.569767] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1256.569984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1256.580472] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1256.754784] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = 
https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1256.810884] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1256.811078] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1256.911577] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a7f3636-02ca-4fb3-aea2-cd9c0047351e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.914093] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.914093] nova-compute[62208]: warnings.warn( [ 1256.919493] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f8d1980-6d7c-4a59-908e-8a57d2098906 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.922390] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.922390] nova-compute[62208]: warnings.warn( [ 1256.948749] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3d55d3c-21b9-4269-b84f-e41414494604 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.951273] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.951273] nova-compute[62208]: warnings.warn( [ 1256.957618] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-747854ae-4572-43fb-8e10-7a6c5a4da797 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.961526] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1256.961526] nova-compute[62208]: warnings.warn( [ 1256.971366] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1256.980705] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1256.997721] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.427s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1256.998379] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Traceback (most recent call last): [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] self.driver.spawn(context, instance, image_meta, [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, 
in spawn [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] self._fetch_image_if_missing(context, vi) [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] image_cache(vi, tmp_image_ds_loc) [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] vm_util.copy_virtual_disk( [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] session._wait_for_task(vmdk_copy_task) [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] return self.wait_for_task(task_ref) [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] return evt.wait() [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] result = hub.switch() [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] return self.greenlet.switch() [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] self.f(*self.args, **self.kw) [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1256.998379] nova-compute[62208]: 
ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] raise exceptions.translate_fault(task_info.error) [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Faults: ['InvalidArgument'] [ 1256.998379] nova-compute[62208]: ERROR nova.compute.manager [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] [ 1256.999610] nova-compute[62208]: DEBUG nova.compute.utils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1257.001106] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Build of instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 was re-scheduled: A specified parameter was not correct: fileType [ 1257.001106] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1257.001566] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1257.001795] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1257.002025] nova-compute[62208]: DEBUG nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1257.002246] nova-compute[62208]: DEBUG nova.network.neutron [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1257.440434] nova-compute[62208]: DEBUG nova.network.neutron [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1257.455217] nova-compute[62208]: INFO nova.compute.manager [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Took 0.45 seconds to deallocate network for instance. [ 1257.560085] nova-compute[62208]: INFO nova.scheduler.client.report [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Deleted allocations for instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 [ 1257.589964] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-95286149-51de-4d02-8e5e-cdeb294ddae9 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 646.711s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.591724] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 449.634s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1257.592050] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Acquiring lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1257.592298] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock 
"14c248a0-9f16-40a5-a8c2-06536fdd8cb7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1257.592469] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.594515] nova-compute[62208]: INFO nova.compute.manager [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Terminating instance [ 1257.596294] nova-compute[62208]: DEBUG nova.compute.manager [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1257.596437] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1257.596908] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cb5bc5c2-1cb2-4562-9c6b-d7203ae6fa7c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.600282] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1257.600282] nova-compute[62208]: warnings.warn( [ 1257.607728] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25ec32d7-0116-4aa6-a9a6-9a5e02866a73 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1257.619500] nova-compute[62208]: DEBUG nova.compute.manager [None req-f16f15a8-06f0-4368-8c66-a91235282f03 tempest-ServerActionsTestOtherB-1030249969 tempest-ServerActionsTestOtherB-1030249969-project-member] [instance: b21f3c77-3118-4229-96b4-7e242f0d5ad5] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1257.622158] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1257.622158] nova-compute[62208]: warnings.warn( [ 1257.641545] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 14c248a0-9f16-40a5-a8c2-06536fdd8cb7 could not be found. [ 1257.641802] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1257.642005] nova-compute[62208]: INFO nova.compute.manager [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1257.642223] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1257.642463] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1257.642554] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1257.646921] nova-compute[62208]: DEBUG nova.compute.manager [None req-f16f15a8-06f0-4368-8c66-a91235282f03 tempest-ServerActionsTestOtherB-1030249969 tempest-ServerActionsTestOtherB-1030249969-project-member] [instance: b21f3c77-3118-4229-96b4-7e242f0d5ad5] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1257.670268] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1257.677807] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] Took 0.04 seconds to deallocate network for instance. 
[ 1257.683284] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f16f15a8-06f0-4368-8c66-a91235282f03 tempest-ServerActionsTestOtherB-1030249969 tempest-ServerActionsTestOtherB-1030249969-project-member] Lock "b21f3c77-3118-4229-96b4-7e242f0d5ad5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 242.811s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.694006] nova-compute[62208]: DEBUG nova.compute.manager [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] [instance: 38c6aba9-f3ad-475b-b0d2-3feb30073826] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1257.730401] nova-compute[62208]: DEBUG nova.compute.manager [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] [instance: 38c6aba9-f3ad-475b-b0d2-3feb30073826] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1257.755592] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] Lock "38c6aba9-f3ad-475b-b0d2-3feb30073826" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 242.040s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.765997] nova-compute[62208]: DEBUG nova.compute.manager [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] [instance: 706adf2f-f3a5-4c70-bdaa-a31911d3fda8] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1257.798323] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cca59baa-f581-4a31-8ae5-a0373e50c294 tempest-ServersTestFqdnHostnames-913108469 tempest-ServersTestFqdnHostnames-913108469-project-member] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.207s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.799337] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 83.378s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1257.799491] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 14c248a0-9f16-40a5-a8c2-06536fdd8cb7] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1257.799727] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "14c248a0-9f16-40a5-a8c2-06536fdd8cb7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.801630] nova-compute[62208]: DEBUG nova.compute.manager [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] [instance: 706adf2f-f3a5-4c70-bdaa-a31911d3fda8] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1257.823379] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] Lock "706adf2f-f3a5-4c70-bdaa-a31911d3fda8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 242.065s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.835638] nova-compute[62208]: DEBUG nova.compute.manager [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] [instance: 228a512d-ee17-440e-88e2-023af55853c1] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1257.864186] nova-compute[62208]: DEBUG nova.compute.manager [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] [instance: 228a512d-ee17-440e-88e2-023af55853c1] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1257.886796] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8378c214-5f8e-4da0-a3e1-e318d606d194 tempest-ListServersNegativeTestJSON-971902274 tempest-ListServersNegativeTestJSON-971902274-project-member] Lock "228a512d-ee17-440e-88e2-023af55853c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 242.085s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.900361] nova-compute[62208]: DEBUG nova.compute.manager [None req-9f62ef70-5f8f-429b-848b-c8c763b758c5 tempest-InstanceActionsTestJSON-1658190985 tempest-InstanceActionsTestJSON-1658190985-project-member] [instance: 3c0deb5b-4eeb-45fb-a248-187734fb8820] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1257.927625] nova-compute[62208]: DEBUG nova.compute.manager [None req-9f62ef70-5f8f-429b-848b-c8c763b758c5 tempest-InstanceActionsTestJSON-1658190985 tempest-InstanceActionsTestJSON-1658190985-project-member] [instance: 3c0deb5b-4eeb-45fb-a248-187734fb8820] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1257.950445] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9f62ef70-5f8f-429b-848b-c8c763b758c5 tempest-InstanceActionsTestJSON-1658190985 tempest-InstanceActionsTestJSON-1658190985-project-member] Lock "3c0deb5b-4eeb-45fb-a248-187734fb8820" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 241.659s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.961614] nova-compute[62208]: DEBUG nova.compute.manager [None req-10a457f4-7093-4c34-96ef-2551f63550ce tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5fb62e8e-15de-4e18-a8e9-a5507e29a4bd] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1257.987021] nova-compute[62208]: DEBUG nova.compute.manager [None req-10a457f4-7093-4c34-96ef-2551f63550ce tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5fb62e8e-15de-4e18-a8e9-a5507e29a4bd] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1258.010029] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10a457f4-7093-4c34-96ef-2551f63550ce tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "5fb62e8e-15de-4e18-a8e9-a5507e29a4bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 240.967s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1258.021451] nova-compute[62208]: DEBUG nova.compute.manager [None req-b2ecdd67-6d41-43a6-a4a6-9f9e52580430 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: bed3e7fa-24a1-4399-82de-4a128d655376] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1258.047978] nova-compute[62208]: DEBUG nova.compute.manager [None req-b2ecdd67-6d41-43a6-a4a6-9f9e52580430 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: bed3e7fa-24a1-4399-82de-4a128d655376] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1258.075044] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b2ecdd67-6d41-43a6-a4a6-9f9e52580430 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "bed3e7fa-24a1-4399-82de-4a128d655376" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.311s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1258.088455] nova-compute[62208]: DEBUG nova.compute.manager [None req-0c18eb50-6f0e-4caf-babe-32d8a6a2ca81 tempest-ServerGroupTestJSON-2076335407 tempest-ServerGroupTestJSON-2076335407-project-member] [instance: e01f0362-0594-471f-a5f9-b444fb774606] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1258.116906] nova-compute[62208]: DEBUG nova.compute.manager [None req-0c18eb50-6f0e-4caf-babe-32d8a6a2ca81 tempest-ServerGroupTestJSON-2076335407 tempest-ServerGroupTestJSON-2076335407-project-member] [instance: e01f0362-0594-471f-a5f9-b444fb774606] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1258.139764] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0c18eb50-6f0e-4caf-babe-32d8a6a2ca81 tempest-ServerGroupTestJSON-2076335407 tempest-ServerGroupTestJSON-2076335407-project-member] Lock "e01f0362-0594-471f-a5f9-b444fb774606" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 231.419s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1258.150745] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1258.210850] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1258.211273] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1258.212965] nova-compute[62208]: INFO nova.compute.claims [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1258.492338] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3826222f-1a70-44f5-9b33-5b97d3f86b21 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.495144] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1258.495144] nova-compute[62208]: warnings.warn( [ 1258.500249] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d500c9f-6208-48cc-84f5-72b0f2b41100 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.503733] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1258.503733] nova-compute[62208]: warnings.warn( [ 1258.531431] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e70f5b4f-f845-45e5-9bf2-01fb65b3e7ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.533789] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1258.533789] nova-compute[62208]: warnings.warn( [ 1258.538941] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8036beb-a1e1-4ef4-b3ef-7cd8bf8852f2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.542581] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1258.542581] nova-compute[62208]: warnings.warn( [ 1258.552294] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1258.560803] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1258.576671] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.365s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1258.576816] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1258.615510] nova-compute[62208]: DEBUG nova.compute.utils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1258.616910] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1258.617066] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1258.628376] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1258.689068] nova-compute[62208]: DEBUG nova.policy [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '94efa93df051433fa31280448e5bc4af', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0f817d636d704c348130b00a3ec0599d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1258.710901] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1258.733183] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1258.733423] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1258.733580] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1258.733753] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1258.733898] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1258.734046] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1258.734247] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1258.734406] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] 
Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1258.734574] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1258.734733] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1258.734902] nova-compute[62208]: DEBUG nova.virt.hardware [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1258.735773] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e73312c-e59f-41bd-a3d5-91b866d6e0bd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.738338] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1258.738338] nova-compute[62208]: warnings.warn( [ 1258.744295] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-367aacb7-a4c1-4183-a7fa-ba73b5c9e4ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.748145] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1258.748145] nova-compute[62208]: warnings.warn( [ 1258.983566] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Successfully created port: 99ac5c7e-a89c-40cf-bc59-8dfad731560d {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1259.710984] nova-compute[62208]: DEBUG nova.compute.manager [req-859cffbc-667e-46f6-a765-a76cfc9032c1 req-5dfe9e6e-7e39-4238-910a-416397d45a71 service nova] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Received event network-vif-plugged-99ac5c7e-a89c-40cf-bc59-8dfad731560d {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1259.710984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-859cffbc-667e-46f6-a765-a76cfc9032c1 req-5dfe9e6e-7e39-4238-910a-416397d45a71 service nova] Acquiring lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1259.710984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-859cffbc-667e-46f6-a765-a76cfc9032c1 req-5dfe9e6e-7e39-4238-910a-416397d45a71 service nova] Lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1259.710984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-859cffbc-667e-46f6-a765-a76cfc9032c1 req-5dfe9e6e-7e39-4238-910a-416397d45a71 service nova] Lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1259.710984] nova-compute[62208]: DEBUG nova.compute.manager [req-859cffbc-667e-46f6-a765-a76cfc9032c1 req-5dfe9e6e-7e39-4238-910a-416397d45a71 service nova] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] No waiting events found dispatching network-vif-plugged-99ac5c7e-a89c-40cf-bc59-8dfad731560d {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1259.710984] nova-compute[62208]: WARNING nova.compute.manager [req-859cffbc-667e-46f6-a765-a76cfc9032c1 req-5dfe9e6e-7e39-4238-910a-416397d45a71 service nova] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Received unexpected event network-vif-plugged-99ac5c7e-a89c-40cf-bc59-8dfad731560d for instance with vm_state building and task_state spawning. 
[ 1259.795622] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Successfully updated port: 99ac5c7e-a89c-40cf-bc59-8dfad731560d {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1259.808630] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "refresh_cache-8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1259.808788] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquired lock "refresh_cache-8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1259.808922] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1259.850651] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1259.995558] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Updating instance_info_cache with network_info: [{"id": "99ac5c7e-a89c-40cf-bc59-8dfad731560d", "address": "fa:16:3e:62:96:dd", "network": {"id": "2b6cbf64-1997-4364-a9aa-ce07f2c8635e", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-451857132-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "0f817d636d704c348130b00a3ec0599d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff3ecd2f-0b10-4faf-a512-fd7a20c28df1", "external-id": "nsx-vlan-transportzone-291", "segmentation_id": 291, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99ac5c7e-a8", "ovs_interfaceid": "99ac5c7e-a89c-40cf-bc59-8dfad731560d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1260.010867] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Releasing lock "refresh_cache-8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1260.011177] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Instance network_info: |[{"id": "99ac5c7e-a89c-40cf-bc59-8dfad731560d", "address": "fa:16:3e:62:96:dd", "network": {"id": "2b6cbf64-1997-4364-a9aa-ce07f2c8635e", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-451857132-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "0f817d636d704c348130b00a3ec0599d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff3ecd2f-0b10-4faf-a512-fd7a20c28df1", "external-id": "nsx-vlan-transportzone-291", "segmentation_id": 291, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99ac5c7e-a8", "ovs_interfaceid": "99ac5c7e-a89c-40cf-bc59-8dfad731560d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1260.011640] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:62:96:dd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff3ecd2f-0b10-4faf-a512-fd7a20c28df1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '99ac5c7e-a89c-40cf-bc59-8dfad731560d', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1260.019142] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Creating folder: Project (0f817d636d704c348130b00a3ec0599d). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1260.019811] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-63680483-91a1-4fb5-8986-e88abc5ea0bc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.021611] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1260.021611] nova-compute[62208]: warnings.warn( [ 1260.033033] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Created folder: Project (0f817d636d704c348130b00a3ec0599d) in parent group-v17427. [ 1260.033258] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Creating folder: Instances. Parent ref: group-v17523. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1260.033512] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8e33a742-8ee9-46ad-a6ae-2d7718101766 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.035512] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1260.035512] nova-compute[62208]: warnings.warn( [ 1260.047210] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Created folder: Instances in parent group-v17523. 
[ 1260.047509] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1260.047720] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1260.048052] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0816e0ff-8177-4ced-a7ff-844a511f86a9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.064488] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1260.064488] nova-compute[62208]: warnings.warn( [ 1260.070673] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1260.070673] nova-compute[62208]: value = "task-38555" [ 1260.070673] nova-compute[62208]: _type = "Task" [ 1260.070673] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1260.073907] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1260.073907] nova-compute[62208]: warnings.warn( [ 1260.079765] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38555, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1260.574624] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1260.574624] nova-compute[62208]: warnings.warn( [ 1260.580998] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38555, 'name': CreateVM_Task, 'duration_secs': 0.283894} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1260.581278] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1260.581983] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1260.582291] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1260.585231] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1d862f6-1cc3-4bb2-8fc3-893436598d66 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.596525] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1260.596525] nova-compute[62208]: warnings.warn( [ 1260.619753] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Reconfiguring VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1260.620251] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-d0d69da9-0e54-4416-b51d-82faff70b57c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.630444] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1260.630444] nova-compute[62208]: warnings.warn( [ 1260.637016] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Waiting for the task: (returnval){ [ 1260.637016] nova-compute[62208]: value = "task-38556" [ 1260.637016] nova-compute[62208]: _type = "Task" [ 1260.637016] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1260.640194] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1260.640194] nova-compute[62208]: warnings.warn( [ 1260.646266] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Task: {'id': task-38556, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1261.142479] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1261.142479] nova-compute[62208]: warnings.warn( [ 1261.147773] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Task: {'id': task-38556, 'name': ReconfigVM_Task, 'duration_secs': 0.108017} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1261.148271] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Reconfigured VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1261.148556] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.566s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1261.148919] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1261.149134] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1261.149508] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1261.149832] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-37893a7d-f2e2-4efb-b091-c6cc17447527 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.151570] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1261.151570] nova-compute[62208]: warnings.warn( [ 1261.155242] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Waiting for the task: (returnval){ [ 1261.155242] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525df2ee-6f3a-1f82-1147-f81f13867fd9" [ 1261.155242] nova-compute[62208]: _type = "Task" [ 1261.155242] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1261.158433] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1261.158433] nova-compute[62208]: warnings.warn( [ 1261.167814] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525df2ee-6f3a-1f82-1147-f81f13867fd9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1261.595168] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "b7855bc3-7f66-4755-b8bb-82604ae49df5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1261.595406] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "b7855bc3-7f66-4755-b8bb-82604ae49df5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1261.659352] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1261.659352] nova-compute[62208]: warnings.warn( [ 1261.666389] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1261.666658] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1261.666937] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1261.746672] nova-compute[62208]: DEBUG nova.compute.manager [req-71ed3aea-7a74-43bd-b4c6-01d707789af7 req-514f7c17-9a47-4e6a-a57e-e5d58568f013 service nova] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Received event network-changed-99ac5c7e-a89c-40cf-bc59-8dfad731560d {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1261.746894] nova-compute[62208]: DEBUG nova.compute.manager [req-71ed3aea-7a74-43bd-b4c6-01d707789af7 req-514f7c17-9a47-4e6a-a57e-e5d58568f013 service nova] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Refreshing instance network info cache due to event network-changed-99ac5c7e-a89c-40cf-bc59-8dfad731560d. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1261.747127] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-71ed3aea-7a74-43bd-b4c6-01d707789af7 req-514f7c17-9a47-4e6a-a57e-e5d58568f013 service nova] Acquiring lock "refresh_cache-8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1261.747561] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-71ed3aea-7a74-43bd-b4c6-01d707789af7 req-514f7c17-9a47-4e6a-a57e-e5d58568f013 service nova] Acquired lock "refresh_cache-8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1261.747741] nova-compute[62208]: DEBUG nova.network.neutron [req-71ed3aea-7a74-43bd-b4c6-01d707789af7 req-514f7c17-9a47-4e6a-a57e-e5d58568f013 service nova] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Refreshing network info cache for port 99ac5c7e-a89c-40cf-bc59-8dfad731560d {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1262.019625] nova-compute[62208]: DEBUG nova.network.neutron [req-71ed3aea-7a74-43bd-b4c6-01d707789af7 req-514f7c17-9a47-4e6a-a57e-e5d58568f013 service nova] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Updated VIF entry in instance network info cache for port 99ac5c7e-a89c-40cf-bc59-8dfad731560d. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1262.019977] nova-compute[62208]: DEBUG nova.network.neutron [req-71ed3aea-7a74-43bd-b4c6-01d707789af7 req-514f7c17-9a47-4e6a-a57e-e5d58568f013 service nova] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Updating instance_info_cache with network_info: [{"id": "99ac5c7e-a89c-40cf-bc59-8dfad731560d", "address": "fa:16:3e:62:96:dd", "network": {"id": "2b6cbf64-1997-4364-a9aa-ce07f2c8635e", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-451857132-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "0f817d636d704c348130b00a3ec0599d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff3ecd2f-0b10-4faf-a512-fd7a20c28df1", "external-id": "nsx-vlan-transportzone-291", "segmentation_id": 291, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99ac5c7e-a8", "ovs_interfaceid": "99ac5c7e-a89c-40cf-bc59-8dfad731560d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1262.030009] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-71ed3aea-7a74-43bd-b4c6-01d707789af7 req-514f7c17-9a47-4e6a-a57e-e5d58568f013 service nova] Releasing lock "refresh_cache-8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1275.503952] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 
tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.144796] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.140603] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.140894] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1303.140894] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1303.162183] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162183] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162183] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162352] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162382] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162486] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162608] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162724] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162843] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.162962] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1303.163108] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1303.163603] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.163771] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.163930] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1304.141517] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1304.141517] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1304.151609] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1304.151848] 
nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1304.152027] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1304.152171] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1304.153299] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5477b9d0-c2e2-4c14-85e9-f9dc68b89d2c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.156114] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1304.156114] nova-compute[62208]: warnings.warn( [ 1304.162671] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20e88b14-05ff-42b1-a86f-5ba6420b289f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.166602] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1304.166602] nova-compute[62208]: warnings.warn( [ 1304.179923] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e92090b-1ea3-4174-940d-df25650a918a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.181754] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1304.181754] nova-compute[62208]: warnings.warn( [ 1304.186622] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4208db57-b569-454b-98bb-0e51a158493a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.189763] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1304.189763] nova-compute[62208]: warnings.warn( [ 1304.217416] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181979MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1304.217582] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1304.217785] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1304.288584] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.288742] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.288914] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.289053] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.289175] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.289295] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.289415] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.289532] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.289648] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.289761] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1304.301566] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.312114] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.322648] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.333641] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.344137] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.355377] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.366313] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 795665a3-58eb-4d8a-bedc-84e399e11bb7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.376868] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance a72bbc63-b475-4be7-a412-f0f893a094f4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.388010] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b7855bc3-7f66-4755-b8bb-82604ae49df5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1304.388278] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1304.388429] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1304.648317] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76cb7aea-6861-4f64-a912-bb4ad1f90e80 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.650907] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1304.650907] nova-compute[62208]: warnings.warn( [ 1304.656254] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c75e72d5-5c4d-427c-af00-c31701e76caf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.659677] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1304.659677] nova-compute[62208]: warnings.warn( [ 1304.686079] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2fa2ae3-d4b1-473e-88a4-96b3ce7b0656 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.688582] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1304.688582] nova-compute[62208]: warnings.warn( [ 1304.693902] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4e49614-2e8e-4a9e-a5ee-575252cf9a04 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.698987] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1304.698987] nova-compute[62208]: warnings.warn( [ 1304.708746] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1304.717615] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1304.733885] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1304.734130] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.516s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1305.447900] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1305.447900] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1305.448552] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f 
tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1305.450168] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1305.450413] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Copying Virtual Disk [datastore2] vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/1599910c-ea51-43b2-9e90-28937e98d1f8/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1305.450696] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6c783ff6-83ab-4c81-88c4-0a8eeb37fe90 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1305.452799] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1305.452799] nova-compute[62208]: warnings.warn( [ 1305.458802] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Waiting for the task: (returnval){ [ 1305.458802] nova-compute[62208]: value = "task-38557" [ 1305.458802] nova-compute[62208]: _type = "Task" [ 1305.458802] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1305.462153] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1305.462153] nova-compute[62208]: warnings.warn( [ 1305.467238] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Task: {'id': task-38557, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1305.735176] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1305.962713] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1305.962713] nova-compute[62208]: warnings.warn( [ 1305.969434] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1305.970119] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1305.970334] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Traceback (most recent call last): [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] yield resources [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] self.driver.spawn(context, instance, image_meta, [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: 
ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] self._fetch_image_if_missing(context, vi) [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] image_cache(vi, tmp_image_ds_loc) [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] vm_util.copy_virtual_disk( [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] session._wait_for_task(vmdk_copy_task) [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] return self.wait_for_task(task_ref) [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] return evt.wait() [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] result = hub.switch() [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] return self.greenlet.switch() [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] self.f(*self.args, **self.kw) [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] raise exceptions.translate_fault(task_info.error) [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] 
Faults: ['InvalidArgument'] [ 1305.970334] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] [ 1305.971118] nova-compute[62208]: INFO nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Terminating instance [ 1305.972275] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1305.972490] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1305.972734] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f388d194-56fb-413d-89b0-757e8bb5b2cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1305.975263] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1305.975449] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1305.976298] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aec2507-b463-4b71-a0d6-a4ef4eae469d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1305.978847] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1305.978847] nova-compute[62208]: warnings.warn( [ 1305.979383] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1305.979383] nova-compute[62208]: warnings.warn( [ 1305.983619] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1305.983863] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2e59486f-1dd6-4440-bcd1-7178938b109c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1305.986246] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1305.986417] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1305.986976] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1305.986976] nova-compute[62208]: warnings.warn( [ 1305.987370] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-292c2dbd-a645-4776-bdbd-691278c86003 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1305.989382] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1305.989382] nova-compute[62208]: warnings.warn( [ 1305.992510] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Waiting for the task: (returnval){ [ 1305.992510] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e701a5-50a5-993d-9a7d-02ad6e705119" [ 1305.992510] nova-compute[62208]: _type = "Task" [ 1305.992510] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1305.995703] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1305.995703] nova-compute[62208]: warnings.warn( [ 1306.000258] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e701a5-50a5-993d-9a7d-02ad6e705119, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1306.497568] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.497568] nova-compute[62208]: warnings.warn( [ 1306.503894] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1306.504175] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Creating directory with path [datastore2] vmware_temp/d3e8fdda-875c-4f19-9a26-03dcd02d3921/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1306.504414] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-14739dbd-5bde-423a-b887-aeb9911cf0c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.506736] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.506736] nova-compute[62208]: warnings.warn( [ 1306.527652] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Created directory with path [datastore2] vmware_temp/d3e8fdda-875c-4f19-9a26-03dcd02d3921/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1306.527902] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Fetch image to [datastore2] vmware_temp/d3e8fdda-875c-4f19-9a26-03dcd02d3921/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1306.528105] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/d3e8fdda-875c-4f19-9a26-03dcd02d3921/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1306.529911] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a12fd787-0587-4155-b54d-a5c7e0b84128 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.532566] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1306.532763] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1306.532940] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Deleting the datastore file [datastore2] ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1306.533362] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-858e645b-7377-4888-9150-8a9e7e03cca9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.534803] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.534803] nova-compute[62208]: warnings.warn( [ 1306.535212] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.535212] nova-compute[62208]: warnings.warn( [ 1306.540673] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d1e0030-38dc-45b3-8327-2048d4e175e9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.543498] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Waiting for the task: (returnval){ [ 1306.543498] nova-compute[62208]: value = "task-38559" [ 1306.543498] nova-compute[62208]: _type = "Task" [ 1306.543498] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1306.543757] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.543757] nova-compute[62208]: warnings.warn( [ 1306.550704] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.550704] nova-compute[62208]: warnings.warn( [ 1306.552304] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6b98f02-6a2a-423e-aa88-f4ba788d35e1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.558733] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Task: {'id': task-38559, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1306.558867] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.558867] nova-compute[62208]: warnings.warn( [ 1306.589906] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca050f44-56c2-409f-bf28-e98ed0f43329 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.592394] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.592394] nova-compute[62208]: warnings.warn( [ 1306.596640] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-55df4f1f-4c42-41e1-aa82-55401e194957 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.598503] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.598503] nova-compute[62208]: warnings.warn( [ 1306.619564] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1306.737234] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last):
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] result = getattr(controller, method)(*args, **kwargs)
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self._get(image_id)
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] resp, body = self.http_client.get(url, headers=header)
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.request(url, 'GET', **kwargs)
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self._handle_response(resp)
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise exc.from_response(resp, resp.content)
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712]
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] During handling of the above exception, another exception occurred:
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712]
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last):
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] yield resources
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self.driver.spawn(context, instance, image_meta,
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._fetch_image_if_missing(context, vi)
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] image_fetch(context, vi, tmp_image_ds_loc)
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] images.fetch_image(
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1306.738908] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] metadata = IMAGE_API.get(context, image_ref)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/image/glance.py", line 1206, in get
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return session.show(context, image_id,
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] _reraise_translated_image_exception(image_id)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/image/glance.py", line 1032, in _reraise_translated_image_exception
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise new_exc.with_traceback(exc_trace)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] result = getattr(controller, method)(*args, **kwargs)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self._get(image_id)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] resp, body = self.http_client.get(url, headers=header)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.request(url, 'GET', **kwargs)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self._handle_response(resp)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise exc.from_response(resp, resp.content)
[ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] nova.exception.ImageNotAuthorized: Not authorized for image
77df2b34-a7d7-43a1-a59a-01f7474c0cf7. [ 1306.740031] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1306.740031] nova-compute[62208]: INFO nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Terminating instance [ 1306.740839] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1306.741082] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1306.741358] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1e76dc3c-4e4b-42b9-bccd-198d5615ff02 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.743557] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.743557] nova-compute[62208]: warnings.warn( [ 1306.746606] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1306.746763] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquired lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1306.746930] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1306.751363] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1306.751533] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1306.752321] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fc112776-f1d4-433b-b64c-2a80f11bcccc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.754613] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.754613] nova-compute[62208]: warnings.warn( [ 1306.759318] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Waiting for the task: (returnval){ [ 1306.759318] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52360fd6-e286-0a15-5e08-1a9cdac742b6" [ 1306.759318] nova-compute[62208]: _type = "Task" [ 1306.759318] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1306.763395] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.763395] nova-compute[62208]: warnings.warn( [ 1306.768843] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52360fd6-e286-0a15-5e08-1a9cdac742b6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1306.780991] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1306.806645] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1306.815877] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Releasing lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1306.816853] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1306.817060] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1306.818312] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccf50aa0-9875-4d29-8325-6fba1b21ba38 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.821807] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.821807] nova-compute[62208]: warnings.warn( [ 1306.827052] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1306.827303] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-92066e4f-2a8a-48f9-8ae9-d415437b246a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.828826] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.828826] nova-compute[62208]: warnings.warn( [ 1306.855320] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1306.855543] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1306.855730] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Deleting the datastore file [datastore2] 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1306.856050] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-69034bb9-ca8c-4d82-b480-7da15fdfa989 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1306.857835] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.857835] nova-compute[62208]: warnings.warn( [ 1306.863626] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Waiting for the task: (returnval){ [ 1306.863626] nova-compute[62208]: value = "task-38561" [ 1306.863626] nova-compute[62208]: _type = "Task" [ 1306.863626] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1306.867133] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1306.867133] nova-compute[62208]: warnings.warn( [ 1306.872240] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Task: {'id': task-38561, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1307.048605] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.048605] nova-compute[62208]: warnings.warn( [ 1307.054931] nova-compute[62208]: DEBUG oslo_vmware.api [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Task: {'id': task-38559, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086533} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1307.055198] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1307.055379] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1307.055548] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1307.055755] nova-compute[62208]: INFO nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Took 1.08 seconds to destroy the instance on the hypervisor. 
[ 1307.058094] nova-compute[62208]: DEBUG nova.compute.claims [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936877820> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1307.058293] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1307.058518] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1307.140685] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1307.140856] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1307.264342] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.264342] nova-compute[62208]: warnings.warn( [ 1307.273178] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1307.273427] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Creating directory with path [datastore2] vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1307.273972] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f4897f93-1eb2-4c66-a62c-1feb8ca6ed3e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.275645] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.275645] nova-compute[62208]: warnings.warn( [ 1307.289567] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Created directory with path [datastore2] vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1307.289847] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Fetch image to [datastore2] vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1307.290027] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1307.290846] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f8f673-ce61-466a-ae04-7c321e95381b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.293231] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS 
request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.293231] nova-compute[62208]: warnings.warn( [ 1307.300757] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50e3c0f5-3213-429d-9e7f-50b64b8c188d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.303101] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.303101] nova-compute[62208]: warnings.warn( [ 1307.310336] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66a5c088-bc5f-4991-a65c-202beac80c82 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.316966] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.316966] nova-compute[62208]: warnings.warn( [ 1307.348086] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ffee346-6626-4854-9910-5e72377e6d5c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.350650] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.350650] nova-compute[62208]: warnings.warn( [ 1307.354735] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8e846ea8-ddaf-4b7c-9d75-a35e85157092 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.356368] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.356368] nova-compute[62208]: warnings.warn( [ 1307.369634] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.369634] nova-compute[62208]: warnings.warn( [ 1307.376165] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Task: {'id': task-38561, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.039277} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1307.378543] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1307.378744] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1307.378921] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1307.379101] nova-compute[62208]: INFO nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1307.379355] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1307.380123] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fd233a5-5a1c-4312-91e5-2ea638f817bd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.382899] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1307.384870] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1307.384977] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1307.387546] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.387546] nova-compute[62208]: warnings.warn( [ 1307.392624] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0edca45-b348-491a-90e5-50fb9b71d2c2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.395461] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.395461] nova-compute[62208]: warnings.warn( [ 1307.424910] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-375c18e4-c651-4be3-8b5a-fcd3ef93b19a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.427704] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.427704] nova-compute[62208]: warnings.warn( [ 1307.433339] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e761c68-f502-4a68-a2e0-d773ac3e5057 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.438170] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.438170] nova-compute[62208]: warnings.warn( [ 1307.447839] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1307.456472] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1307.475718] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.417s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1307.476263] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Traceback (most recent call last): [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] self.driver.spawn(context, instance, image_meta, [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] self._fetch_image_if_missing(context, vi) [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] image_cache(vi, tmp_image_ds_loc) [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] vm_util.copy_virtual_disk( [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] session._wait_for_task(vmdk_copy_task) [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] return self.wait_for_task(task_ref) [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] return evt.wait() [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] result = hub.switch() [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] return self.greenlet.switch() [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] self.f(*self.args, **self.kw) [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", 
line 448, in _poll_task [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] raise exceptions.translate_fault(task_info.error) [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Faults: ['InvalidArgument'] [ 1307.476263] nova-compute[62208]: ERROR nova.compute.manager [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] [ 1307.477058] nova-compute[62208]: DEBUG nova.compute.utils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1307.481604] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Build of instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 was re-scheduled: A specified parameter was not correct: fileType [ 1307.481604] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1307.481993] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1307.482166] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1307.482332] nova-compute[62208]: DEBUG nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1307.482494] nova-compute[62208]: DEBUG nova.network.neutron [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1307.494571] nova-compute[62208]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1307.494923] nova-compute[62208]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-4753f1f0-530c-48dc-af73-48575940c25e'] [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1307.497079] 
nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1307.497079] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1307.498978] nova-compute[62208]: ERROR nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1307.520349] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1307.577985] nova-compute[62208]: WARNING nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Could not clean up failed build, not rescheduling. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1307.578262] nova-compute[62208]: DEBUG nova.compute.claims [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Aborting claim: <nova.compute.claims.Claim object at 0x7fb93691eaa0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1307.578443] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1307.578750] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1307.585659] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1307.586000] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1307.889805] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a77cd98-7f73-454d-b53e-006d554ddbd3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.892354] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.892354] nova-compute[62208]: warnings.warn( [ 1307.902129] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d1b98f8-4286-4b7f-88ec-703c5be83cb9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.902129] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.902129] nova-compute[62208]: warnings.warn( [ 1307.930760] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dd1ba56-0736-4dee-bac1-a034072622a8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.933465] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.933465] nova-compute[62208]: warnings.warn( [ 1307.939216] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5faf6f1-64c1-468e-b6c2-1eb3dec2e2df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.956630] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1307.956630] nova-compute[62208]: warnings.warn( [ 1307.971040] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1307.987114] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1307.995090] nova-compute[62208]: DEBUG nova.network.neutron [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1308.007323] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.428s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.007575] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Build of instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2661}} [ 1308.008353] nova-compute[62208]: DEBUG nova.compute.utils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Build of instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
{{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1308.010039] nova-compute[62208]: ERROR nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Build of instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7.: nova.exception.BuildAbortException: Build of instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. [ 1308.010234] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1308.010487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1308.010641] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquired lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1308.010802] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1308.012541] nova-compute[62208]: INFO nova.compute.manager [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Took 0.53 seconds to deallocate network for instance. [ 1308.040024] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1308.068480] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1308.078105] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Releasing lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1308.078337] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1308.078505] nova-compute[62208]: DEBUG nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1308.078669] nova-compute[62208]: DEBUG nova.network.neutron [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1308.120020] nova-compute[62208]: INFO nova.scheduler.client.report [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Deleted allocations for instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 [ 1308.142275] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-41ba88d0-cc22-4deb-b6a6-69c9a3167c8f tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 683.098s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.144592] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 485.973s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.145093] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 
tempest-InstanceActionsV221TestJSON-973339105-project-member] Acquiring lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1308.145510] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.146201] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.148677] nova-compute[62208]: INFO nova.compute.manager [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Terminating instance [ 1308.151274] nova-compute[62208]: DEBUG nova.compute.manager [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1308.151682] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1308.152380] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-647849f4-6180-4c6a-9f5a-85cabe51390d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.157363] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1308.159791] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.159791] nova-compute[62208]: warnings.warn( [ 1308.167328] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38e5ff57-3b2b-4cc5-aa12-27002aa914c5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.179168] nova-compute[62208]: DEBUG neutronclient.v2_0.client [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last): [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] exception_handler_v20(status_code, error_body) [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise client_exc(message=error_message, [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Neutron server returns request_ids: ['req-4753f1f0-530c-48dc-af73-48575940c25e'] [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] During handling of the above exception, another exception occurred: [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last): [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager 
[instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2902, in _build_resources [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._shutdown_instance(context, instance, [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3161, in _shutdown_instance [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._try_deallocate_network(context, instance, requested_networks) [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] with excutils.save_and_reraise_exception(): [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self.force_reraise() [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise self.value [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] _deallocate_network_with_retries() [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return evt.wait() [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] result = hub.switch() [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.greenlet.switch() [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] result = func(*self.args, **self.kw) [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 
9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] result = f(*args, **kwargs) [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1308.180698] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._deallocate_network( [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self.network_api.deallocate_for_instance( [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] data = neutron.list_ports(**search_opts) [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.list('ports', self.ports_path, retrieve_all, [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] for r in self._pagination(collection, path, **params): [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] res = self.get(path, params=params) [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.retry_request("GET", action, body=body, [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.do_request(method, action, body=body, [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._handle_fault_response(status_code, replybody, resp) [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] During handling of the above exception, another exception occurred: [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last): [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2447, in _do_build_and_run_instance [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._build_and_run_instance(context, instance, image, [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2660, in _build_and_run_instance [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] with excutils.save_and_reraise_exception(): [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1308.182042] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self.force_reraise() [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise self.value [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] with self._build_resources(context, instance, [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__ [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self.gen.throw(typ, value, traceback) [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2910, in _build_resources [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise exception.BuildAbortException( [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] nova.exception.BuildAbortException: Build of instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] During handling of the above exception, another exception occurred: [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last): [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] exception_handler_v20(status_code, error_body) [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise client_exc(message=error_message, [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Neutron server returns request_ids: ['req-56b59af5-8bf9-447d-ba15-36ca09323f3f'] [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] During handling of the above exception, another exception occurred: [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last): [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3036, in _cleanup_allocated_networks [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._deallocate_network(context, instance, requested_networks) [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self.network_api.deallocate_for_instance( [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File 
"/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] data = neutron.list_ports(**search_opts) [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.list('ports', self.ports_path, retrieve_all, [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1308.183172] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] for r in self._pagination(collection, path, **params): [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] res = self.get(path, params=params) [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.retry_request("GET", action, body=body, [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.do_request(method, action, body=body, [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File 
"/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._handle_fault_response(status_code, replybody, resp) [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise exception.Unauthorized() [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] nova.exception.Unauthorized: Not authorized. [ 1308.184200] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.184200] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.184200] nova-compute[62208]: warnings.warn( [ 1308.200512] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841 could not be found. [ 1308.200726] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1308.200907] nova-compute[62208]: INFO nova.compute.manager [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1308.201161] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1308.202003] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1308.202107] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1308.221720] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1308.221963] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.223414] nova-compute[62208]: INFO nova.compute.claims [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1308.238460] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1308.258155] nova-compute[62208]: INFO nova.compute.manager [-] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] Took 0.06 seconds to deallocate network for instance. 
[ 1308.275085] nova-compute[62208]: INFO nova.scheduler.client.report [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Deleted allocations for instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 [ 1308.275385] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7dcf4707-0531-4f64-be21-a72cfe8a1211 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 638.571s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.276665] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 442.218s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.276800] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1308.277002] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.277190] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.279335] nova-compute[62208]: INFO nova.compute.manager [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Terminating instance [ 1308.283838] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquiring lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1308.283988] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 
tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Acquired lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1308.284181] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1308.305327] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1308.317742] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1308.374984] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1308.382461] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1840cfda-292f-4d33-a376-7409cb1dda14 tempest-InstanceActionsV221TestJSON-973339105 tempest-InstanceActionsV221TestJSON-973339105-project-member] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.238s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.383880] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 133.962s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.384140] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1308.384417] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ebfeff1b-6ae0-4d68-83fb-7dbc60e2a841" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.385541] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Releasing lock "refresh_cache-9b3e4e69-4f4c-48f3-957e-0cfd979b5712" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1308.385980] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1308.387330] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1308.387957] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e4a6f770-d51c-4743-bcce-2ba628270a4f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.390444] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.390444] nova-compute[62208]: warnings.warn( [ 1308.401416] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6d35971-89d8-454e-8123-c86bb03bb761 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.418466] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.418466] nova-compute[62208]: warnings.warn( [ 1308.436939] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9b3e4e69-4f4c-48f3-957e-0cfd979b5712 could not be found. 
[ 1308.436939] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1308.437129] nova-compute[62208]: INFO nova.compute.manager [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1308.437357] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1308.438357] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1308.438590] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1308.438687] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1308.524612] nova-compute[62208]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1308.524886] nova-compute[62208]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-8fc6f66c-dc31-4659-93fd-1c199db44548'] [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1308.525441] 
nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1308.525441] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1308.526886] nova-compute[62208]: ERROR nova.compute.manager [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last): [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] exception_handler_v20(status_code, error_body) [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise client_exc(message=error_message, [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Neutron server returns request_ids: ['req-8fc6f66c-dc31-4659-93fd-1c199db44548'] [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] During handling of the above exception, another exception occurred: [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Traceback (most recent call last): [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3332, in do_terminate_instance [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._delete_instance(context, instance, bdms) [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3267, in _delete_instance [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._shutdown_instance(context, instance, bdms) [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", 
line 3161, in _shutdown_instance [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._try_deallocate_network(context, instance, requested_networks) [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] with excutils.save_and_reraise_exception(): [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self.force_reraise() [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise self.value [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] _deallocate_network_with_retries() [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return evt.wait() [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] result = hub.switch() [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.greenlet.switch() [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] result = func(*self.args, **self.kw) [ 1308.556614] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] result = f(*args, **kwargs) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 3062, in 
_deallocate_network_with_retries [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._deallocate_network( [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self.network_api.deallocate_for_instance( [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] data = neutron.list_ports(**search_opts) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.list('ports', self.ports_path, retrieve_all, [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] for r in self._pagination(collection, path, **params): [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] res = self.get(path, params=params) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.retry_request("GET", action, body=body, [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.557704] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] return self.do_request(method, action, body=body, [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] ret = obj(*args, **kwargs) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] self._handle_fault_response(status_code, replybody, resp) [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1308.557704] nova-compute[62208]: ERROR nova.compute.manager [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] [ 1308.564743] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50677fd0-07c0-44bb-be47-bd0827b71ed8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.567899] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.567899] nova-compute[62208]: warnings.warn( [ 1308.573093] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25935d79-5ee6-4199-a4c8-abcb8a025e70 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.576418] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.576418] nova-compute[62208]: warnings.warn( [ 1308.605192] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.328s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.606815] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f12e9f44-354c-4ecd-a34e-cbb76d6c3494 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.610848] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 134.189s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.611052] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] During sync_power_state the instance has a pending task (deleting). Skip. [ 1308.611227] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "9b3e4e69-4f4c-48f3-957e-0cfd979b5712" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.611623] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.611623] nova-compute[62208]: warnings.warn( [ 1308.618008] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18635d4f-6ac3-4bb7-a00b-7a6faffa1231 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.626622] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.626622] nova-compute[62208]: warnings.warn( [ 1308.639014] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1308.649587] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1308.672990] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.450s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.672990] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1308.677718] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.239s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.679158] nova-compute[62208]: INFO nova.compute.claims [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1308.714182] nova-compute[62208]: INFO nova.compute.manager [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] [instance: 9b3e4e69-4f4c-48f3-957e-0cfd979b5712] Successfully reverted task state from None on failure for instance. [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server [None req-cc994287-229e-4bd0-9fda-49e9ac51cfa6 tempest-ServerExternalEventsTest-960011189 tempest-ServerExternalEventsTest-960011189-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-8fc6f66c-dc31-4659-93fd-1c199db44548'] [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server 
return f(self, context, *args, **kw) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3344, in terminate_instance [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3339, in do_terminate_instance [ 1308.718795] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3332, in do_terminate_instance [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3267, in _delete_instance [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3161, in _shutdown_instance [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1308.720311] 
nova-compute[62208]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1308.720311] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1308.722275] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1308.722275] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 
1308.722275] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1308.722275] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1308.722275] nova-compute[62208]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1308.722275] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1308.737966] nova-compute[62208]: DEBUG nova.compute.utils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1308.739628] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1308.739799] nova-compute[62208]: DEBUG nova.network.neutron [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1308.754489] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1308.787650] nova-compute[62208]: DEBUG nova.policy [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7534a5a8a37e4451918e35c8b93d4ad5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8eef1e68dea42cf98f03dc8db29498a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1308.833065] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1308.856194] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1308.856443] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1308.856603] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1308.856782] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1308.856930] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1308.857093] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1308.857299] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1308.857458] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1308.857638] nova-compute[62208]: DEBUG nova.virt.hardware [None 
req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1308.857872] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1308.858070] nova-compute[62208]: DEBUG nova.virt.hardware [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1308.859101] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e48d82f2-16bb-43d2-8084-bdce29efa5e4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.863821] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.863821] nova-compute[62208]: warnings.warn( [ 1308.869809] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a4781ab-f291-41e0-b5f8-6166b6cbd8b1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.873818] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1308.873818] nova-compute[62208]: warnings.warn( [ 1308.998747] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8020c6f6-e8de-4d20-befa-f5f58b7eb702 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.001393] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1309.001393] nova-compute[62208]: warnings.warn( [ 1309.007150] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e374565-1836-4111-b666-e575f4742c4e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.010748] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1309.010748] nova-compute[62208]: warnings.warn( [ 1309.062548] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b7d9203-15b9-4253-8d68-79ba2cbc7fda {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.066674] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1309.066674] nova-compute[62208]: warnings.warn( [ 1309.074866] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d40faaa2-0a43-4af1-a73c-79f788c529cc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.081849] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1309.081849] nova-compute[62208]: warnings.warn( [ 1309.100367] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1309.110183] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1309.126949] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.449s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1309.127451] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1309.139385] nova-compute[62208]: DEBUG nova.network.neutron [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Successfully created port: 895e98c2-3051-483b-9a3e-c58652a5b46f {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1309.169394] nova-compute[62208]: DEBUG nova.compute.utils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1309.170982] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1309.171329] nova-compute[62208]: DEBUG nova.network.neutron [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1309.186220] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1309.256757] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1309.278692] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1309.279472] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1309.279756] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1309.280248] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1309.280624] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1309.280883] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1309.281202] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1309.281536] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1309.281809] 
nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1309.282091] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1309.282372] nova-compute[62208]: DEBUG nova.virt.hardware [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1309.283354] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfe95186-7735-4b68-9295-6094c3b0a617 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.286073] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1309.286073] nova-compute[62208]: warnings.warn( [ 1309.292118] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bc0f3e9-2cee-4797-8d2d-1558ec0bccde {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.297046] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1309.297046] nova-compute[62208]: warnings.warn( [ 1309.419170] nova-compute[62208]: DEBUG nova.policy [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6b6f55f71a546a1b0e123b97244f260', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '544d3e5a19da441896ee085efafec48e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1309.934446] nova-compute[62208]: DEBUG nova.network.neutron [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Successfully created port: cf31e87a-32f9-480e-81ed-3c91152548dd {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1310.352747] nova-compute[62208]: DEBUG nova.compute.manager [req-0f1d56ac-a8d0-4df3-a9e1-ca7f3a87ddc2 req-a3db6e11-72b6-4938-a503-65f3e696e5ff service nova] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Received event network-vif-plugged-895e98c2-3051-483b-9a3e-c58652a5b46f {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1310.352976] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-0f1d56ac-a8d0-4df3-a9e1-ca7f3a87ddc2 req-a3db6e11-72b6-4938-a503-65f3e696e5ff service nova] Acquiring lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1310.353185] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-0f1d56ac-a8d0-4df3-a9e1-ca7f3a87ddc2 req-a3db6e11-72b6-4938-a503-65f3e696e5ff service nova] Lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1310.353353] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-0f1d56ac-a8d0-4df3-a9e1-ca7f3a87ddc2 req-a3db6e11-72b6-4938-a503-65f3e696e5ff service nova] Lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1310.353514] nova-compute[62208]: DEBUG nova.compute.manager [req-0f1d56ac-a8d0-4df3-a9e1-ca7f3a87ddc2 req-a3db6e11-72b6-4938-a503-65f3e696e5ff service nova] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] No waiting events found dispatching network-vif-plugged-895e98c2-3051-483b-9a3e-c58652a5b46f {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1310.353672] nova-compute[62208]: WARNING nova.compute.manager [req-0f1d56ac-a8d0-4df3-a9e1-ca7f3a87ddc2 req-a3db6e11-72b6-4938-a503-65f3e696e5ff service nova] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Received unexpected event 
network-vif-plugged-895e98c2-3051-483b-9a3e-c58652a5b46f for instance with vm_state building and task_state spawning. [ 1310.471962] nova-compute[62208]: DEBUG nova.network.neutron [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Successfully updated port: 895e98c2-3051-483b-9a3e-c58652a5b46f {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1310.483831] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "refresh_cache-c0d7e5a6-e905-47ee-87d7-cda8543be1f2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1310.483995] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "refresh_cache-c0d7e5a6-e905-47ee-87d7-cda8543be1f2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1310.484145] nova-compute[62208]: DEBUG nova.network.neutron [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1310.550751] nova-compute[62208]: DEBUG nova.network.neutron [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1310.937502] nova-compute[62208]: DEBUG nova.network.neutron [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Updating instance_info_cache with network_info: [{"id": "895e98c2-3051-483b-9a3e-c58652a5b46f", "address": "fa:16:3e:c0:9d:b3", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap895e98c2-30", "ovs_interfaceid": "895e98c2-3051-483b-9a3e-c58652a5b46f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1310.952316] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "refresh_cache-c0d7e5a6-e905-47ee-87d7-cda8543be1f2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1310.952519] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Instance network_info: |[{"id": "895e98c2-3051-483b-9a3e-c58652a5b46f", "address": "fa:16:3e:c0:9d:b3", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap895e98c2-30", "ovs_interfaceid": "895e98c2-3051-483b-9a3e-c58652a5b46f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1310.952896] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c0:9d:b3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'da623279-b6f6-4570-8b15-a332120b8b60', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '895e98c2-3051-483b-9a3e-c58652a5b46f', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1310.960281] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating folder: Project (e8eef1e68dea42cf98f03dc8db29498a). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1310.961235] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7016cf55-390c-431b-8ceb-1cef0bc5afaf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1310.962982] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1310.962982] nova-compute[62208]: warnings.warn( [ 1310.973185] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created folder: Project (e8eef1e68dea42cf98f03dc8db29498a) in parent group-v17427. [ 1310.973395] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating folder: Instances. Parent ref: group-v17528. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1310.973643] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-43d26f55-c273-4974-8bba-4bb7061de6d1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1310.975255] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1310.975255] nova-compute[62208]: warnings.warn( [ 1310.983820] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created folder: Instances in parent group-v17528. [ 1310.984173] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1310.984438] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1310.984601] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0c99a1f1-21be-46b5-8527-b0548e133f68 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1310.999459] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1310.999459] nova-compute[62208]: warnings.warn( [ 1311.005560] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1311.005560] nova-compute[62208]: value = "task-38564" [ 1311.005560] nova-compute[62208]: _type = "Task" [ 1311.005560] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1311.011271] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.011271] nova-compute[62208]: warnings.warn( [ 1311.016494] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38564, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1311.017856] nova-compute[62208]: DEBUG nova.network.neutron [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Successfully updated port: cf31e87a-32f9-480e-81ed-3c91152548dd {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1311.035304] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquiring lock "refresh_cache-4d29dc3e-1090-49fd-83b7-96b8e6855ede" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1311.035519] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquired lock "refresh_cache-4d29dc3e-1090-49fd-83b7-96b8e6855ede" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1311.035752] nova-compute[62208]: DEBUG nova.network.neutron [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1311.088759] nova-compute[62208]: DEBUG nova.network.neutron [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1311.263279] nova-compute[62208]: DEBUG nova.network.neutron [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Updating instance_info_cache with network_info: [{"id": "cf31e87a-32f9-480e-81ed-3c91152548dd", "address": "fa:16:3e:b2:86:31", "network": {"id": "77389d97-3c0a-4197-912d-b47d6e3ec728", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-747230804-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "544d3e5a19da441896ee085efafec48e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30c39e9a-a798-4f25-a48c-91f786ba332c", "external-id": "nsx-vlan-transportzone-438", "segmentation_id": 438, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf31e87a-32", "ovs_interfaceid": "cf31e87a-32f9-480e-81ed-3c91152548dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1311.279506] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Releasing lock "refresh_cache-4d29dc3e-1090-49fd-83b7-96b8e6855ede" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1311.279835] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Instance network_info: |[{"id": "cf31e87a-32f9-480e-81ed-3c91152548dd", "address": "fa:16:3e:b2:86:31", "network": {"id": "77389d97-3c0a-4197-912d-b47d6e3ec728", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-747230804-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "544d3e5a19da441896ee085efafec48e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30c39e9a-a798-4f25-a48c-91f786ba332c", "external-id": "nsx-vlan-transportzone-438", "segmentation_id": 438, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf31e87a-32", "ovs_interfaceid": "cf31e87a-32f9-480e-81ed-3c91152548dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1311.280326] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b2:86:31', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '30c39e9a-a798-4f25-a48c-91f786ba332c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cf31e87a-32f9-480e-81ed-3c91152548dd', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1311.287837] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Creating folder: Project (544d3e5a19da441896ee085efafec48e). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1311.288480] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3cf6c308-6655-4882-86a9-46cf3bc8ffb0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.290262] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.290262] nova-compute[62208]: warnings.warn( [ 1311.300014] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Created folder: Project (544d3e5a19da441896ee085efafec48e) in parent group-v17427. [ 1311.300339] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Creating folder: Instances. Parent ref: group-v17531. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1311.300637] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6a687da-61d7-4f32-96d8-d81466a1792b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.302793] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.302793] nova-compute[62208]: warnings.warn( [ 1311.313371] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Created folder: Instances in parent group-v17531. [ 1311.313671] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1311.313875] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1311.314085] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a71fac3f-98bf-40a9-a4b5-7cde96af17ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.328973] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.328973] nova-compute[62208]: warnings.warn( [ 1311.335735] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1311.335735] nova-compute[62208]: value = "task-38567" [ 1311.335735] nova-compute[62208]: _type = "Task" [ 1311.335735] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1311.344161] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.344161] nova-compute[62208]: warnings.warn( [ 1311.344161] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38567, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1311.510180] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.510180] nova-compute[62208]: warnings.warn( [ 1311.516257] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38564, 'name': CreateVM_Task} progress is 25%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1311.838431] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.838431] nova-compute[62208]: warnings.warn( [ 1311.845603] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38567, 'name': CreateVM_Task, 'duration_secs': 0.318702} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1311.845811] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1311.846393] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1311.846613] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1311.849442] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05eb462c-133a-45d0-8980-8642dbd07717 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.859611] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.859611] nova-compute[62208]: warnings.warn( [ 1311.882620] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Reconfiguring VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1311.883016] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-602e16dd-a2fc-4f91-88ea-2c08aa1b386d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.893525] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.893525] nova-compute[62208]: warnings.warn( [ 1311.899322] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Waiting for the task: (returnval){ [ 1311.899322] nova-compute[62208]: value = "task-38568" [ 1311.899322] nova-compute[62208]: _type = "Task" [ 1311.899322] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1311.902758] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1311.902758] nova-compute[62208]: warnings.warn( [ 1311.910037] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Task: {'id': task-38568, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1312.010035] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1312.010035] nova-compute[62208]: warnings.warn( [ 1312.016115] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38564, 'name': CreateVM_Task} progress is 25%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1312.393145] nova-compute[62208]: DEBUG nova.compute.manager [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Received event network-changed-895e98c2-3051-483b-9a3e-c58652a5b46f {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1312.393439] nova-compute[62208]: DEBUG nova.compute.manager [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Refreshing instance network info cache due to event network-changed-895e98c2-3051-483b-9a3e-c58652a5b46f. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1312.393755] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Acquiring lock "refresh_cache-c0d7e5a6-e905-47ee-87d7-cda8543be1f2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1312.394032] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Acquired lock "refresh_cache-c0d7e5a6-e905-47ee-87d7-cda8543be1f2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1312.394208] nova-compute[62208]: DEBUG nova.network.neutron [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Refreshing network info cache for port 895e98c2-3051-483b-9a3e-c58652a5b46f {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1312.403217] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1312.403217] nova-compute[62208]: warnings.warn( [ 1312.412193] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Task: {'id': task-38568, 'name': ReconfigVM_Task, 'duration_secs': 0.134065} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1312.412470] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Reconfigured VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1312.412674] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.566s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1312.412909] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1312.413048] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1312.413356] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1312.413641] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-abd848da-97bc-43ee-971c-9ef556d4bcd1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1312.415374] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1312.415374] nova-compute[62208]: warnings.warn( [ 1312.419792] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Waiting for the task: (returnval){ [ 1312.419792] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52826e70-5852-6610-83bf-a4ebb2e971c1" [ 1312.419792] nova-compute[62208]: _type = "Task" [ 1312.419792] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1312.425207] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1312.425207] nova-compute[62208]: warnings.warn( [ 1312.430640] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52826e70-5852-6610-83bf-a4ebb2e971c1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1312.511319] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1312.511319] nova-compute[62208]: warnings.warn( [ 1312.516906] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38564, 'name': CreateVM_Task} progress is 25%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1312.798934] nova-compute[62208]: DEBUG nova.network.neutron [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Updated VIF entry in instance network info cache for port 895e98c2-3051-483b-9a3e-c58652a5b46f. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1312.799155] nova-compute[62208]: DEBUG nova.network.neutron [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Updating instance_info_cache with network_info: [{"id": "895e98c2-3051-483b-9a3e-c58652a5b46f", "address": "fa:16:3e:c0:9d:b3", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap895e98c2-30", "ovs_interfaceid": "895e98c2-3051-483b-9a3e-c58652a5b46f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1312.813970] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Releasing lock "refresh_cache-c0d7e5a6-e905-47ee-87d7-cda8543be1f2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1312.814698] nova-compute[62208]: DEBUG nova.compute.manager [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Received event network-vif-plugged-cf31e87a-32f9-480e-81ed-3c91152548dd {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1312.814698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Acquiring lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1312.814698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1312.814865] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
1312.814915] nova-compute[62208]: DEBUG nova.compute.manager [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] No waiting events found dispatching network-vif-plugged-cf31e87a-32f9-480e-81ed-3c91152548dd {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1312.815254] nova-compute[62208]: WARNING nova.compute.manager [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Received unexpected event network-vif-plugged-cf31e87a-32f9-480e-81ed-3c91152548dd for instance with vm_state building and task_state spawning. [ 1312.815254] nova-compute[62208]: DEBUG nova.compute.manager [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Received event network-changed-cf31e87a-32f9-480e-81ed-3c91152548dd {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1312.815372] nova-compute[62208]: DEBUG nova.compute.manager [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Refreshing instance network info cache due to event network-changed-cf31e87a-32f9-480e-81ed-3c91152548dd. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1312.815548] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Acquiring lock "refresh_cache-4d29dc3e-1090-49fd-83b7-96b8e6855ede" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1312.816444] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Acquired lock "refresh_cache-4d29dc3e-1090-49fd-83b7-96b8e6855ede" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1312.816444] nova-compute[62208]: DEBUG nova.network.neutron [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Refreshing network info cache for port cf31e87a-32f9-480e-81ed-3c91152548dd {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1312.923674] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1312.923674] nova-compute[62208]: warnings.warn( [ 1312.930142] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1312.930385] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1312.930591] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1313.021898] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1313.021898] nova-compute[62208]: warnings.warn( [ 1313.029468] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38564, 'name': CreateVM_Task, 'duration_secs': 1.680529} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1313.029468] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1313.029961] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1313.030146] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1313.032984] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-116c892e-3a71-423d-aff2-1213cfb92603 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.049723] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1313.049723] nova-compute[62208]: warnings.warn( [ 1313.081081] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Reconfiguring VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1313.081081] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-742934b4-cc71-4d07-8e35-5e10999e3ba9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.091718] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1313.091718] nova-compute[62208]: warnings.warn( [ 1313.097938] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 1313.097938] nova-compute[62208]: value = "task-38569" [ 1313.097938] nova-compute[62208]: _type = "Task" [ 1313.097938] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1313.101354] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1313.101354] nova-compute[62208]: warnings.warn( [ 1313.107029] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38569, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1313.224549] nova-compute[62208]: DEBUG nova.network.neutron [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Updated VIF entry in instance network info cache for port cf31e87a-32f9-480e-81ed-3c91152548dd. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1313.225173] nova-compute[62208]: DEBUG nova.network.neutron [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Updating instance_info_cache with network_info: [{"id": "cf31e87a-32f9-480e-81ed-3c91152548dd", "address": "fa:16:3e:b2:86:31", "network": {"id": "77389d97-3c0a-4197-912d-b47d6e3ec728", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-747230804-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "544d3e5a19da441896ee085efafec48e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30c39e9a-a798-4f25-a48c-91f786ba332c", "external-id": "nsx-vlan-transportzone-438", "segmentation_id": 438, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf31e87a-32", "ovs_interfaceid": "cf31e87a-32f9-480e-81ed-3c91152548dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1313.236750] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-53988dba-150a-446b-8327-5f5f0c006668 req-44f9ae36-ac28-4d04-bd95-b34b6e4b685b service nova] Releasing lock "refresh_cache-4d29dc3e-1090-49fd-83b7-96b8e6855ede" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1313.602792] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1313.602792] nova-compute[62208]: warnings.warn( [ 1313.608757] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38569, 'name': ReconfigVM_Task, 'duration_secs': 0.119033} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1313.609050] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Reconfigured VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1313.609262] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.579s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1313.609505] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1313.609653] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1313.610001] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1313.610273] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-28e12c23-b598-4240-8b1a-8c4269d9af9f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.611865] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1313.611865] nova-compute[62208]: warnings.warn( [ 1313.615542] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 1313.615542] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52be4477-4175-b591-641d-2f9b6e16a1da" [ 1313.615542] nova-compute[62208]: _type = "Task" [ 1313.615542] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1313.618443] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1313.618443] nova-compute[62208]: warnings.warn( [ 1313.624831] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52be4477-4175-b591-641d-2f9b6e16a1da, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1314.119419] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1314.119419] nova-compute[62208]: warnings.warn( [ 1314.125944] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1314.126253] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1314.126503] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1317.617151] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1317.619131] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1355.466276] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1355.466276] nova-compute[62208]: 
ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1355.466276] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1355.466969] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1355.468696] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1355.468950] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Copying Virtual Disk [datastore2] vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/ae186e34-5677-439a-975f-a31166a3c86c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1355.469248] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-66239099-8259-42cc-8717-867d531f71f4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1355.471483] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1355.471483] nova-compute[62208]: warnings.warn( [ 1355.478974] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Waiting for the task: (returnval){ [ 1355.478974] nova-compute[62208]: value = "task-38570" [ 1355.478974] nova-compute[62208]: _type = "Task" [ 1355.478974] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1355.482516] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1355.482516] nova-compute[62208]: warnings.warn( [ 1355.487697] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Task: {'id': task-38570, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1355.982839] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1355.982839] nova-compute[62208]: warnings.warn( [ 1355.989072] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1355.989363] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1355.989918] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Traceback (most recent call last): [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] yield resources [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] self.driver.spawn(context, instance, image_meta, [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] self._fetch_image_if_missing(context, vi) [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] image_cache(vi, tmp_image_ds_loc) [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] vm_util.copy_virtual_disk( [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] session._wait_for_task(vmdk_copy_task) [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] return self.wait_for_task(task_ref) [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] return evt.wait() [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] result = hub.switch() [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] return self.greenlet.switch() [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] self.f(*self.args, **self.kw) [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] raise exceptions.translate_fault(task_info.error) [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] oslo_vmware.exceptions.VimFaultException: A specified parameter 
was not correct: fileType [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Faults: ['InvalidArgument'] [ 1355.989918] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] [ 1355.990910] nova-compute[62208]: INFO nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Terminating instance [ 1355.991777] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1355.991987] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1355.992450] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2ea14096-d3ca-428f-aaf5-5cf985ec80c9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1355.995086] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1355.995273] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1355.996044] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-194d0cc9-4773-4b6b-999a-a4adfeb4a183 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1355.999361] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1355.999361] nova-compute[62208]: warnings.warn( [ 1355.999728] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1355.999728] nova-compute[62208]: warnings.warn( [ 1356.004522] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1356.004768] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1532a8bd-5d7a-4d78-a728-c92340e7a00d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.007054] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1356.007227] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1356.007821] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.007821] nova-compute[62208]: warnings.warn( [ 1356.008224] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0db17922-819c-4cc2-8c6b-7e008952f412 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.010136] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.010136] nova-compute[62208]: warnings.warn( [ 1356.013027] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 1356.013027] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527cef56-1470-f5e2-c989-bb81424aa544" [ 1356.013027] nova-compute[62208]: _type = "Task" [ 1356.013027] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1356.015979] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.015979] nova-compute[62208]: warnings.warn( [ 1356.020556] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527cef56-1470-f5e2-c989-bb81424aa544, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1356.517178] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.517178] nova-compute[62208]: warnings.warn( [ 1356.523776] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1356.524052] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Creating directory with path [datastore2] vmware_temp/e9ea10ab-1c5e-4cc4-9185-e25c5e4dec87/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1356.524302] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1dae6e85-0edf-4a42-8114-ceedc3f78521 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.526082] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.526082] nova-compute[62208]: warnings.warn( [ 1356.545200] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Created directory with path [datastore2] vmware_temp/e9ea10ab-1c5e-4cc4-9185-e25c5e4dec87/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1356.545426] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Fetch image to [datastore2] vmware_temp/e9ea10ab-1c5e-4cc4-9185-e25c5e4dec87/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1356.545600] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/e9ea10ab-1c5e-4cc4-9185-e25c5e4dec87/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1356.546413] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc4cfe7c-220b-4f3a-93cc-35ba06a816fb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.548864] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.548864] nova-compute[62208]: warnings.warn( [ 1356.553750] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b481cce-85f1-4ff3-8e79-a6468525e701 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.555962] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.555962] nova-compute[62208]: warnings.warn( [ 1356.562988] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77ef2201-17f6-4052-b225-f623abb32968 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.566520] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.566520] nova-compute[62208]: warnings.warn( [ 1356.595262] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e68165f0-0d09-4dba-8139-20baef3051fc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.598027] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.598027] nova-compute[62208]: warnings.warn( [ 1356.602377] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-343b42e0-1925-4137-998f-2eb207830631 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.604378] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.604378] nova-compute[62208]: warnings.warn( [ 1356.629709] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1356.738623] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] result = getattr(controller, method)(*args, **kwargs) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self._get(image_id) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] resp, body = self.http_client.get(url, headers=header) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.request(url, 'GET', **kwargs) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self._handle_response(resp) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise exc.from_response(resp, resp.content) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] During handling of the above exception, another exception occurred: [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] yield resources [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self.driver.spawn(context, instance, image_meta, [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._fetch_image_if_missing(context, vi) [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1356.738623] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] image_fetch(context, vi, tmp_image_ds_loc) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] images.fetch_image( [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] metadata = IMAGE_API.get(context, image_ref) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/image/glance.py", line 1206, in get [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return session.show(context, image_id, [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: 
c4512476-9905-4f33-8575-d0a0f24ed4d5] _reraise_translated_image_exception(image_id) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/image/glance.py", line 1032, in _reraise_translated_image_exception [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise new_exc.with_traceback(exc_trace) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] result = getattr(controller, method)(*args, **kwargs) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self._get(image_id) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] resp, body = self.http_client.get(url, headers=header) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.request(url, 'GET', **kwargs) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self._handle_response(resp) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise exc.from_response(resp, resp.content) [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] nova.exception.ImageNotAuthorized: Not authorized for image 
77df2b34-a7d7-43a1-a59a-01f7474c0cf7. [ 1356.741242] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1356.741242] nova-compute[62208]: INFO nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Terminating instance [ 1356.741242] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1356.741242] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1356.741242] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2902f639-ec6b-495f-807c-1f2da0a150cc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.742859] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1356.743016] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1356.743184] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1356.744054] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.744054] nova-compute[62208]: warnings.warn( [ 1356.751867] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1356.752065] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1356.753282] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-589e1d30-1dad-4bd8-bc91-950010c911b7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.757629] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.757629] nova-compute[62208]: warnings.warn( [ 1356.761394] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Waiting for the task: (returnval){ [ 1356.761394] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52ad6e0e-c2c0-4a69-4c3a-457aa2035a9e" [ 1356.761394] nova-compute[62208]: _type = "Task" [ 1356.761394] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1356.764581] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.764581] nova-compute[62208]: warnings.warn( [ 1356.769560] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52ad6e0e-c2c0-4a69-4c3a-457aa2035a9e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1356.779442] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1356.827063] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1356.837914] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1356.838347] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1356.838533] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1356.839636] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21601085-8c43-4d7e-ad01-ffdf36b03c8e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.842892] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.842892] nova-compute[62208]: warnings.warn( [ 1356.848844] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1356.848844] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b8a0115a-2aa1-46db-b461-836b818fd8f1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.850294] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.850294] nova-compute[62208]: warnings.warn( [ 1356.903364] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1356.903596] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1356.903596] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Deleting the datastore file [datastore2] c4512476-9905-4f33-8575-d0a0f24ed4d5 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1356.903861] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2c9d980b-6e11-4a62-b13d-ac6381f5ca2d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1356.906383] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.906383] nova-compute[62208]: warnings.warn( [ 1356.913495] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for the task: (returnval){ [ 1356.913495] nova-compute[62208]: value = "task-38573" [ 1356.913495] nova-compute[62208]: _type = "Task" [ 1356.913495] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1356.918729] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1356.918729] nova-compute[62208]: warnings.warn( [ 1356.923803] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38573, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1357.266631] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.266631] nova-compute[62208]: warnings.warn( [ 1357.272881] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1357.273255] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Creating directory with path [datastore2] vmware_temp/42b8797c-9a5d-432a-996d-bfacfdaf9b20/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1357.273604] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8e72a2d5-af6a-41c3-84fb-fe3c07cedec8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.275531] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.275531] nova-compute[62208]: warnings.warn( [ 1357.286578] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Created directory with path [datastore2] vmware_temp/42b8797c-9a5d-432a-996d-bfacfdaf9b20/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1357.286904] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Fetch image to [datastore2] vmware_temp/42b8797c-9a5d-432a-996d-bfacfdaf9b20/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1357.287140] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/42b8797c-9a5d-432a-996d-bfacfdaf9b20/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1357.288176] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd7508db-d107-4205-b848-51a76689ae03 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.290850] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.290850] nova-compute[62208]: warnings.warn( [ 1357.295536] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce4d1447-6340-4bcb-842a-9015f4c62268 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.297782] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.297782] nova-compute[62208]: warnings.warn( [ 1357.304854] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25e63b7f-c3e7-4173-ad7e-4fc7a16d32fa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.308545] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.308545] nova-compute[62208]: warnings.warn( [ 1357.337244] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82d77ded-590f-4f4b-8e73-7cb445f258d0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.340195] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.340195] nova-compute[62208]: warnings.warn( [ 1357.344501] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7cb6f175-685f-46c3-95f1-a682ecf8c23d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.346232] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.346232] nova-compute[62208]: warnings.warn( [ 1357.365452] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1357.417703] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.417703] nova-compute[62208]: warnings.warn( [ 1357.425851] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Task: {'id': task-38573, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035712} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1357.426088] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1357.426268] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1357.426438] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1357.426608] nova-compute[62208]: INFO nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Took 0.59 seconds to destroy the instance on the hypervisor. [ 1357.426882] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1357.427048] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1357.427139] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1357.456507] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] result = getattr(controller, method)(*args, **kwargs) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self._get(image_id) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] resp, body = self.http_client.get(url, headers=header) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 
1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.request(url, 'GET', **kwargs) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self._handle_response(resp) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise exc.from_response(resp, resp.content) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] During handling of the above exception, another exception occurred: [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] yield resources [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self.driver.spawn(context, instance, image_meta, [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._fetch_image_if_missing(context, vi) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] 
image_fetch(context, vi, tmp_image_ds_loc) [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] images.fetch_image( [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1357.457324] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] metadata = IMAGE_API.get(context, image_ref) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/image/glance.py", line 1206, in get [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return session.show(context, image_id, [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] _reraise_translated_image_exception(image_id) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/image/glance.py", line 1032, in _reraise_translated_image_exception [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise new_exc.with_traceback(exc_trace) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] result = getattr(controller, method)(*args, **kwargs) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self._get(image_id) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] resp, body = self.http_client.get(url, headers=header) [ 
1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.request(url, 'GET', **kwargs) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self._handle_response(resp) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise exc.from_response(resp, resp.content) [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] nova.exception.ImageNotAuthorized: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. [ 1357.458506] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1357.458506] nova-compute[62208]: INFO nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Terminating instance [ 1357.459666] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1357.459666] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1357.460660] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1357.460660] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquired lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1357.460660] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Building 
network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1357.461433] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6cc0d136-e3dd-4234-a3a2-953819728e62 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.465639] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.465639] nova-compute[62208]: warnings.warn( [ 1357.472822] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1357.473172] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1357.473896] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-acba539d-8d8c-47f7-b8df-47ec3466e77a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.476032] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.476032] nova-compute[62208]: warnings.warn( [ 1357.479209] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1357.479209] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]526535b2-8886-b152-b6d0-e23bf21dd1d7" [ 1357.479209] nova-compute[62208]: _type = "Task" [ 1357.479209] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1357.484184] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.484184] nova-compute[62208]: warnings.warn( [ 1357.489694] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]526535b2-8886-b152-b6d0-e23bf21dd1d7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1357.534621] nova-compute[62208]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1357.534888] nova-compute[62208]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-4f14b5bb-2c62-4857-9086-7db43161a7c8'] [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall 
self._deallocate_network( [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall File 
"/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1357.535399] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1357.536968] nova-compute[62208]: ERROR nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1357.539126] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1357.564881] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1357.574425] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Releasing lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1357.574885] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1357.575149] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1357.576486] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b864268-443c-4b49-96e4-73917aad427c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.581266] nova-compute[62208]: WARNING nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Could not clean up failed build, not rescheduling. 
Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1357.581497] nova-compute[62208]: DEBUG nova.compute.claims [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936796e00> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1357.581669] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1357.581901] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1357.584290] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.584290] nova-compute[62208]: warnings.warn( [ 1357.589985] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1357.590250] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ac818ecc-6d0b-4672-9eaf-b7ac324f707c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.591874] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.591874] nova-compute[62208]: warnings.warn( [ 1357.629545] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1357.629778] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1357.629962] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Deleting the datastore file [datastore2] 17dbfb9d-4ec1-4937-8bb7-343101f8f61b {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1357.630228] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f2d7fcf5-e3f3-49d8-94b9-70d481f0e177 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.632151] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.632151] nova-compute[62208]: warnings.warn( [ 1357.637717] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Waiting for the task: (returnval){ [ 1357.637717] nova-compute[62208]: value = "task-38575" [ 1357.637717] nova-compute[62208]: _type = "Task" [ 1357.637717] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1357.644212] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.644212] nova-compute[62208]: warnings.warn( [ 1357.650634] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Task: {'id': task-38575, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1357.841752] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2155243-e5c7-46fb-961c-a60dbacbcda1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.844479] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.844479] nova-compute[62208]: warnings.warn( [ 1357.849699] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69cbc138-0422-446b-9689-4fdfcbdd789e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.852636] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.852636] nova-compute[62208]: warnings.warn( [ 1357.880888] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e426bb3-aad8-4f04-a906-befd29c0c679 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.883628] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.883628] nova-compute[62208]: warnings.warn( [ 1357.889656] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2a8d464-9009-45d3-83e9-cbb310c005a5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.893380] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.893380] nova-compute[62208]: warnings.warn( [ 1357.903794] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1357.912655] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1357.933732] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.352s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1357.934130] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Build of instance c4512476-9905-4f33-8575-d0a0f24ed4d5 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2661}} [ 1357.934997] nova-compute[62208]: DEBUG nova.compute.utils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Build of instance c4512476-9905-4f33-8575-d0a0f24ed4d5 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1357.936991] nova-compute[62208]: ERROR nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Build of instance c4512476-9905-4f33-8575-d0a0f24ed4d5 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7.: nova.exception.BuildAbortException: Build of instance c4512476-9905-4f33-8575-d0a0f24ed4d5 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1357.937308] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1357.937626] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1357.937911] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1357.938176] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1357.979753] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1357.986584] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.986584] nova-compute[62208]: warnings.warn( [ 1357.993912] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1357.994576] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating directory with path [datastore2] vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1357.994839] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-34d29fe2-ce6a-4055-9deb-97e0f744b628 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1357.997073] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1357.997073] nova-compute[62208]: warnings.warn( [ 1358.008059] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created directory with path [datastore2] vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1358.008316] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Fetch image to [datastore2] vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1358.008469] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1358.009531] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73114541-4ba1-488b-88c3-6ab3cf7b1cc4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.011891] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.011891] nova-compute[62208]: warnings.warn( [ 1358.017135] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae46ad4c-3260-4fca-b0ea-6e6d3535bb2a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.019427] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.019427] nova-compute[62208]: warnings.warn( [ 1358.020667] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1358.028069] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9e6cf38-1708-4e7c-9872-3c019163776f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.033807] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1358.034022] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1358.034193] nova-compute[62208]: DEBUG nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1358.034362] nova-compute[62208]: DEBUG nova.network.neutron [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1358.037038] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.037038] nova-compute[62208]: warnings.warn( [ 1358.070409] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e36197d-54e8-4f7e-a0b9-f4cbdbc8a38b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.073083] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.073083] nova-compute[62208]: warnings.warn( [ 1358.077837] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a1027df0-ffd2-4e48-898a-98474d9f5f01 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.079880] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.079880] nova-compute[62208]: warnings.warn( [ 1358.098903] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1358.142207] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.142207] nova-compute[62208]: warnings.warn( [ 1358.148128] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Task: {'id': task-38575, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.044322} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1358.149987] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1358.150354] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1358.150543] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1358.151334] nova-compute[62208]: INFO nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Took 0.58 seconds to destroy the instance on the hypervisor. [ 1358.151596] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1358.152117] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1358.152274] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1358.169319] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1358.172615] nova-compute[62208]: DEBUG neutronclient.v2_0.client [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] exception_handler_v20(status_code, error_body) [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise client_exc(message=error_message, [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Neutron server returns request_ids: ['req-4f14b5bb-2c62-4857-9086-7db43161a7c8'] [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] During handling of the above exception, another exception occurred: [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2902, in _build_resources [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._shutdown_instance(context, instance, [ 
1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3161, in _shutdown_instance [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._try_deallocate_network(context, instance, requested_networks) [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] with excutils.save_and_reraise_exception(): [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self.force_reraise() [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise self.value [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] _deallocate_network_with_retries() [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return evt.wait() [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] result = hub.switch() [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.greenlet.switch() [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] result = func(*self.args, **self.kw) [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] result = f(*args, **kwargs) [ 1358.173887] 
nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1358.173887] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._deallocate_network( [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self.network_api.deallocate_for_instance( [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] data = neutron.list_ports(**search_opts) [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.list('ports', self.ports_path, retrieve_all, [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] for r in self._pagination(collection, path, **params): [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] res = self.get(path, params=params) [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.retry_request("GET", action, body=body, [ 1358.174979] nova-compute[62208]: 
ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.do_request(method, action, body=body, [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._handle_fault_response(status_code, replybody, resp) [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] During handling of the above exception, another exception occurred: [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2447, in _do_build_and_run_instance [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._build_and_run_instance(context, instance, image, [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2660, in _build_and_run_instance [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] with excutils.save_and_reraise_exception(): [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.174979] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self.force_reraise() [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise self.value [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] with self._build_resources(context, instance, [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__ [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self.gen.throw(typ, value, traceback) [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2910, in _build_resources [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise exception.BuildAbortException( [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] nova.exception.BuildAbortException: Build of instance c4512476-9905-4f33-8575-d0a0f24ed4d5 aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] During handling of the above exception, another exception occurred: [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] exception_handler_v20(status_code, error_body) [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise client_exc(message=error_message, [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Neutron server returns request_ids: ['req-f29cc967-afc1-47a4-9b00-ea96345bf66b'] [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] During handling of the above exception, another exception occurred: [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3036, in _cleanup_allocated_networks [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._deallocate_network(context, instance, requested_networks) [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self.network_api.deallocate_for_instance( [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File 
"/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] data = neutron.list_ports(**search_opts) [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.list('ports', self.ports_path, retrieve_all, [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1358.176397] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] for r in self._pagination(collection, path, **params): [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] res = self.get(path, params=params) [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.retry_request("GET", action, body=body, [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.do_request(method, action, body=body, [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File 
"/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._handle_fault_response(status_code, replybody, resp) [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise exception.Unauthorized() [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] nova.exception.Unauthorized: Not authorized. [ 1358.177532] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.231737] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1358.231810] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1358.290380] nova-compute[62208]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1358.290655] nova-compute[62208]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-37db40b5-875f-4ad8-870e-a8651a77a367'] [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1358.291154] 
nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1358.291154] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1358.293181] nova-compute[62208]: ERROR nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
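The 401 responses and the resulting nova.exception.NeutronAdminCredentialConfigurationInvalid in the traceback above come from the neutron calls that nova-compute makes with its own service credentials; the log's own hint is to verify the Neutron admin credential configured in nova.conf. A minimal, standalone diagnostic sketch (Python, assuming the keystoneauth1 and python-neutronclient libraries already present in this venv; every credential value is a placeholder standing in for the [neutron] section of nova.conf):

    # check_neutron_admin.py -- reproduce the list_ports call that fails with
    # 401 in the log, but outside nova, using explicitly supplied credentials.
    # All values below are placeholders; copy the real ones from the
    # [neutron] section of nova.conf.
    from keystoneauth1.identity import v3
    from keystoneauth1 import session
    from neutronclient.v2_0 import client as neutron_client

    auth = v3.Password(
        auth_url="http://controller:5000/v3",    # [neutron] auth_url (placeholder)
        username="neutron",                       # [neutron] username
        password="SERVICE_PASSWORD",              # [neutron] password
        project_name="service",                   # [neutron] project_name
        user_domain_name="Default",               # [neutron] user_domain_name
        project_domain_name="Default",            # [neutron] project_domain_name
    )
    sess = session.Session(auth=auth)
    neutron = neutron_client.Client(session=sess)

    # If the credentials are wrong this raises Unauthorized (HTTP 401), the same
    # failure the deallocate path hits above; otherwise it prints one port.
    print(neutron.list_ports(limit=1))

If this succeeds with the values taken from nova.conf, the 401 is more likely a stale or expired request-context token being reused by the cleanup path (note the ~593 s the build lock was held before cleanup ran) than a misconfigured admin credential.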
[ 1358.301250] nova-compute[62208]: INFO nova.scheduler.client.report [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Deleted allocations for instance c4512476-9905-4f33-8575-d0a0f24ed4d5 [ 1358.301629] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d4a98656-3fc1-4039-9f13-6a38058ee0aa tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 593.226s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1358.303212] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 396.838s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1358.303464] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "c4512476-9905-4f33-8575-d0a0f24ed4d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1358.303677] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "c4512476-9905-4f33-8575-d0a0f24ed4d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1358.303844] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "c4512476-9905-4f33-8575-d0a0f24ed4d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1358.305946] nova-compute[62208]: INFO nova.compute.manager [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Terminating instance [ 1358.307450] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquiring lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1358.308059] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Acquired lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1358.308059] nova-compute[62208]: DEBUG nova.network.neutron [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1358.314305] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1358.336046] nova-compute[62208]: WARNING nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Could not clean up failed build, not rescheduling. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1358.336316] nova-compute[62208]: DEBUG nova.compute.claims [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9360eec80> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1358.336484] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1358.336700] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1358.339872] nova-compute[62208]: DEBUG nova.network.neutron [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1358.368452] nova-compute[62208]: DEBUG nova.network.neutron [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1358.379104] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Releasing lock "refresh_cache-c4512476-9905-4f33-8575-d0a0f24ed4d5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1358.379602] nova-compute[62208]: DEBUG nova.compute.manager [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1358.379894] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1358.380444] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f9767aab-5660-485c-93e7-676eba2bcb2d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.382230] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.382230] nova-compute[62208]: warnings.warn( [ 1358.387580] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1358.391166] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-102ef898-5190-4c81-a29a-87d423b8d1e2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.405452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.405452] nova-compute[62208]: warnings.warn( [ 1358.423963] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c4512476-9905-4f33-8575-d0a0f24ed4d5 could not be found. [ 1358.424254] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1358.424376] nova-compute[62208]: INFO nova.compute.manager [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1358.424626] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1358.424876] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1358.424973] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1358.517226] nova-compute[62208]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1358.517226] nova-compute[62208]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-f6ea69c2-dae5-4952-97be-1e0eb6c49a8e'] [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1358.517359] 
nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1358.517359] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1358.519006] nova-compute[62208]: ERROR nova.compute.manager [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
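The block above shows every Neutron call made while tearing down instance c4512476-9905-4f33-8575-d0a0f24ed4d5 coming back as HTTP 401 and being converted by nova's wrapper in nova/network/neutron.py (wrapper at lines 196/212 in the traceback) into NeutronAdminCredentialConfigurationInvalid, which is why the log asks to "verify Neutron admin credential located in nova.conf". A minimal way to check those service credentials outside of nova-compute is sketched below; it is only a sketch, and the Keystone URL, user, password and project names are placeholders that do not appear in this log and should be replaced with the [neutron] section values from nova.conf on this host.

# Sketch only: reproduce the 401 outside of nova-compute using the same
# keystoneauth password flow nova uses for its Neutron service credentials.
# Every credential value below is a placeholder, not taken from this log.
from keystoneauth1.identity import v3
from keystoneauth1 import session
from neutronclient.v2_0 import client as neutron_client

auth = v3.Password(
    auth_url="https://keystone.example.test/identity/v3",   # placeholder
    username="nova",                                         # placeholder
    password="secret",                                       # placeholder
    project_name="service",                                  # placeholder
    user_domain_name="Default",
    project_domain_name="Default",
)
sess = session.Session(auth=auth)

# Raises keystoneauth1.exceptions.Unauthorized if the credentials themselves
# are wrong, i.e. before Neutron is ever contacted.
print(sess.get_token())

# The same list_ports() call deallocate_for_instance() issues in the
# traceback above; a 401 here reproduces neutronclient's Unauthorized.
neutron = neutron_client.Client(session=sess)
print(neutron.list_ports(device_id="c4512476-9905-4f33-8575-d0a0f24ed4d5"))

If the token is issued cleanly but the list_ports() call still returns 401, the problem would sit on the Neutron/keystonemiddleware side rather than in the credentials nova-compute is configured with.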
[ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] exception_handler_v20(status_code, error_body) [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise client_exc(message=error_message, [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Neutron server returns request_ids: ['req-f6ea69c2-dae5-4952-97be-1e0eb6c49a8e'] [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] During handling of the above exception, another exception occurred: [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Traceback (most recent call last): [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3332, in do_terminate_instance [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._delete_instance(context, instance, bdms) [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3267, in _delete_instance [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._shutdown_instance(context, instance, bdms) [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", 
line 3161, in _shutdown_instance [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._try_deallocate_network(context, instance, requested_networks) [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] with excutils.save_and_reraise_exception(): [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self.force_reraise() [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise self.value [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] _deallocate_network_with_retries() [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return evt.wait() [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] result = hub.switch() [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.greenlet.switch() [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] result = func(*self.args, **self.kw) [ 1358.552240] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] result = f(*args, **kwargs) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 3062, in 
_deallocate_network_with_retries [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._deallocate_network( [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self.network_api.deallocate_for_instance( [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] data = neutron.list_ports(**search_opts) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.list('ports', self.ports_path, retrieve_all, [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] for r in self._pagination(collection, path, **params): [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] res = self.get(path, params=params) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.retry_request("GET", action, body=body, [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.553498] 
nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] return self.do_request(method, action, body=body, [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] ret = obj(*args, **kwargs) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] self._handle_fault_response(status_code, replybody, resp) [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1358.553498] nova-compute[62208]: ERROR nova.compute.manager [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] [ 1358.581559] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.278s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1358.582652] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 184.160s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1358.582838] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1358.583010] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "c4512476-9905-4f33-8575-d0a0f24ed4d5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1358.627280] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-712d8dec-6147-49a0-9bbd-bd053d22adc2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.630182] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.630182] nova-compute[62208]: warnings.warn( [ 1358.635390] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec3e60a7-eb8c-445f-8124-11a4057412cd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.639730] nova-compute[62208]: INFO nova.compute.manager [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] [instance: c4512476-9905-4f33-8575-d0a0f24ed4d5] Successfully reverted task state from None on failure for instance. [ 1358.641832] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.641832] nova-compute[62208]: warnings.warn( [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server [None req-8a442968-c85d-4768-8247-d55b8c4e4701 tempest-DeleteServersAdminTestJSON-882406685 tempest-DeleteServersAdminTestJSON-882406685-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-f6ea69c2-dae5-4952-97be-1e0eb6c49a8e'] [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server 
return f(self, context, *args, **kw) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3344, in terminate_instance [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3339, in do_terminate_instance [ 1358.643308] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3332, in do_terminate_instance [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3267, in _delete_instance [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3161, in _shutdown_instance [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1358.644843] 
nova-compute[62208]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.644843] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1358.646450] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1358.646450] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 
1358.646450] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1358.646450] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1358.646450] nova-compute[62208]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1358.646450] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1358.668155] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ed59fb-9865-471c-a0ff-81b90dc57cc4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.670473] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.670473] nova-compute[62208]: warnings.warn( [ 1358.675327] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba26664c-6dac-4e0d-883b-89b511330473 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.679150] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.679150] nova-compute[62208]: warnings.warn( [ 1358.688676] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1358.697364] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1358.713615] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.377s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1358.713840] nova-compute[62208]: DEBUG nova.compute.manager [None 
req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Build of instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2661}} [ 1358.714553] nova-compute[62208]: DEBUG nova.compute.utils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Build of instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1358.715598] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.328s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1358.717118] nova-compute[62208]: INFO nova.compute.claims [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1358.720026] nova-compute[62208]: ERROR nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Build of instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7.: nova.exception.BuildAbortException: Build of instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
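Alongside the Neutron 401s, the build of instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b is aborted above with "Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7", and its network cleanup then runs into the same unauthorized path below. Whether that image is genuinely restricted or this is further fallout from the token problem could be checked with a direct Glance lookup under the requesting project's credentials; as before this is only a sketch, the endpoint and account values are placeholders, and only the image UUID is taken from the log.

# Sketch only: check whether the image from the BuildAbortException above
# is visible to the requesting project. Credentials are placeholders.
from keystoneauth1.identity import v3
from keystoneauth1 import session
from glanceclient import Client as GlanceClient

auth = v3.Password(
    auth_url="https://keystone.example.test/identity/v3",   # placeholder
    username="tempest-user",                                 # placeholder
    password="secret",                                       # placeholder
    project_name="tempest-project",                          # placeholder
    user_domain_name="Default",
    project_domain_name="Default",
)
glance = GlanceClient("2", session=session.Session(auth=auth))

# An HTTPNotFound/HTTPForbidden here would be consistent with the
# "Not authorized for image" abort; a clean result would suggest the image
# is visible and the failure was on the token/credential side instead.
print(glance.images.get("77df2b34-a7d7-43a1-a59a-01f7474c0cf7"))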
[ 1358.720215] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1358.720435] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1358.720582] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquired lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1358.720737] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1358.747087] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1358.773286] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1358.781780] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Releasing lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1358.782005] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1358.782173] nova-compute[62208]: DEBUG nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1358.782339] nova-compute[62208]: DEBUG nova.network.neutron [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1358.868441] nova-compute[62208]: DEBUG neutronclient.v2_0.client [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] exception_handler_v20(status_code, error_body) [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise client_exc(message=error_message, [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Neutron server returns request_ids: ['req-37db40b5-875f-4ad8-870e-a8651a77a367'] [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] 
During handling of the above exception, another exception occurred: [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2902, in _build_resources [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._shutdown_instance(context, instance, [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3161, in _shutdown_instance [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._try_deallocate_network(context, instance, requested_networks) [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] with excutils.save_and_reraise_exception(): [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self.force_reraise() [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise self.value [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] _deallocate_network_with_retries() [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return evt.wait() [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] result = hub.switch() [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.greenlet.switch() [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 
17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] result = func(*self.args, **self.kw) [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] result = f(*args, **kwargs) [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1358.869698] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._deallocate_network( [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self.network_api.deallocate_for_instance( [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] data = neutron.list_ports(**search_opts) [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.list('ports', self.ports_path, retrieve_all, [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] for r in self._pagination(collection, path, **params): [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] res = self.get(path, params=params) [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager 
[instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.retry_request("GET", action, body=body, [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.do_request(method, action, body=body, [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._handle_fault_response(status_code, replybody, resp) [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] During handling of the above exception, another exception occurred: [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2447, in _do_build_and_run_instance [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._build_and_run_instance(context, instance, image, [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2660, in _build_and_run_instance [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] with excutils.save_and_reraise_exception(): [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1358.870814] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self.force_reraise() [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise self.value [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] with self._build_resources(context, instance, [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__ [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self.gen.throw(typ, value, traceback) [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2910, in _build_resources [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise exception.BuildAbortException( [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] nova.exception.BuildAbortException: Build of instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b aborted: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] During handling of the above exception, another exception occurred: [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] exception_handler_v20(status_code, error_body) [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise client_exc(message=error_message, [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Neutron server returns request_ids: ['req-714da69b-b632-4861-befa-032d5a86f100'] [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] During handling of the above exception, another exception occurred: [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3036, in _cleanup_allocated_networks [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._deallocate_network(context, instance, requested_networks) [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self.network_api.deallocate_for_instance( [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File 
"/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] data = neutron.list_ports(**search_opts) [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.list('ports', self.ports_path, retrieve_all, [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1358.872635] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] for r in self._pagination(collection, path, **params): [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] res = self.get(path, params=params) [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.retry_request("GET", action, body=body, [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.do_request(method, action, body=body, [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File 
"/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._handle_fault_response(status_code, replybody, resp) [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise exception.Unauthorized() [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] nova.exception.Unauthorized: Not authorized. [ 1358.873801] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1358.966171] nova-compute[62208]: INFO nova.scheduler.client.report [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Deleted allocations for instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b [ 1358.966448] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4c189b09-9b4b-4018-9881-9017126bad73 tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 545.207s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1358.967781] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 348.440s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1358.968035] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1358.968242] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1358.968408] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 
tempest-MigrationsAdminTest-2095859088-project-member] Lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1358.974001] nova-compute[62208]: INFO nova.compute.manager [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Terminating instance [ 1358.975699] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquiring lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1358.975854] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Acquired lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1358.976057] nova-compute[62208]: DEBUG nova.network.neutron [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1358.981122] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1358.992860] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8dee264-ff0f-499b-a346-39d830396695 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.995492] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1358.995492] nova-compute[62208]: warnings.warn( [ 1359.002449] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61a44146-08a9-41a2-8795-91aad6414a2b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.005952] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.005952] nova-compute[62208]: warnings.warn( [ 1359.036515] nova-compute[62208]: DEBUG nova.network.neutron [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1359.042583] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40bfc522-ef65-4e76-8d2a-f25925396ef8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.049065] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.049065] nova-compute[62208]: warnings.warn( [ 1359.055950] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dd103c7-271a-416f-9a9d-260fbd584ea8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.060603] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1359.060788] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.060788] nova-compute[62208]: warnings.warn( [ 1359.073459] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1359.082758] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1359.097567] nova-compute[62208]: DEBUG nova.network.neutron [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1359.101174] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1359.101641] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1359.104847] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.044s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1359.105565] nova-compute[62208]: INFO nova.compute.claims [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1359.108668] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Releasing lock "refresh_cache-17dbfb9d-4ec1-4937-8bb7-343101f8f61b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1359.109112] nova-compute[62208]: DEBUG nova.compute.manager [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1359.109295] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1359.109793] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6641b746-daac-441d-b4a4-fd727798d925 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.113561] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.113561] nova-compute[62208]: warnings.warn( [ 1359.120777] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-847f8c01-8092-476c-a930-2748682cc6e4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.131452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.131452] nova-compute[62208]: warnings.warn( [ 1359.151176] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1359.151604] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 17dbfb9d-4ec1-4937-8bb7-343101f8f61b could not be found. [ 1359.151826] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1359.152077] nova-compute[62208]: INFO nova.compute.manager [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1359.152378] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1359.153775] nova-compute[62208]: DEBUG nova.compute.utils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1359.155624] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1359.155758] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1359.157666] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1359.157895] nova-compute[62208]: DEBUG nova.network.neutron [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1359.166375] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1359.221220] nova-compute[62208]: DEBUG nova.policy [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48cf6bc9785d46088589c14e7e8c14ca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '910bab22145d4f8cbd354ecf005eed6a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1359.250610] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1359.273162] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1359.273527] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1359.273743] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1359.273985] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1359.274190] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1359.274537] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1359.274650] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1359.274883] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1359.275244] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1359.275486] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1359.275752] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1359.278969] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e76b98a-a60a-435c-8ec1-c082bd51ce4b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.284215] nova-compute[62208]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1359.284493] nova-compute[62208]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-15f7ff1e-4722-4f76-870c-b670224a5165'] [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1359.285039] 
nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1359.285039] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1359.286343] nova-compute[62208]: ERROR nova.compute.manager [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1359.288471] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.288471] nova-compute[62208]: warnings.warn( [ 1359.295188] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93699d35-b30a-4afc-894c-353e19592d90 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.298971] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.298971] nova-compute[62208]: warnings.warn( [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] exception_handler_v20(status_code, error_body) [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise client_exc(message=error_message, [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Neutron server returns request_ids: ['req-15f7ff1e-4722-4f76-870c-b670224a5165'] [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] During handling of the above exception, another exception occurred: [ 1359.330197] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Traceback (most recent call last): [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3332, in do_terminate_instance [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._delete_instance(context, instance, bdms) [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3267, in _delete_instance [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._shutdown_instance(context, instance, bdms) [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3161, in _shutdown_instance [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._try_deallocate_network(context, instance, requested_networks) [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] with excutils.save_and_reraise_exception(): [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self.force_reraise() [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise self.value [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] _deallocate_network_with_retries() [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return evt.wait() [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] result = hub.switch() [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.greenlet.switch() [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] result = func(*self.args, **self.kw) [ 1359.330197] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] result = f(*args, **kwargs) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._deallocate_network( [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self.network_api.deallocate_for_instance( [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] data = neutron.list_ports(**search_opts) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.list('ports', self.ports_path, retrieve_all, [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] for r in self._pagination(collection, path, **params): [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] res = self.get(path, params=params) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.retry_request("GET", action, body=body, [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] return self.do_request(method, action, body=body, [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] ret = obj(*args, **kwargs) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] self._handle_fault_response(status_code, replybody, resp) [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1359.331176] nova-compute[62208]: ERROR nova.compute.manager [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] [ 1359.360659] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.393s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1359.364227] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 184.941s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1359.364501] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] During sync_power_state the instance has a pending task (deleting). Skip. [ 1359.364760] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "17dbfb9d-4ec1-4937-8bb7-343101f8f61b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1359.436649] nova-compute[62208]: INFO nova.compute.manager [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] [instance: 17dbfb9d-4ec1-4937-8bb7-343101f8f61b] Successfully reverted task state from None on failure for instance. [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server [None req-1ca9b6a7-2d94-4825-86e8-30c8779a63ce tempest-MigrationsAdminTest-2095859088 tempest-MigrationsAdminTest-2095859088-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-15f7ff1e-4722-4f76-870c-b670224a5165'] [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server 
return f(self, context, *args, **kw) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3344, in terminate_instance [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3339, in do_terminate_instance [ 1359.441015] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3332, in do_terminate_instance [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3267, in _delete_instance [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3161, in _shutdown_instance [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1359.442441] 
nova-compute[62208]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1359.442441] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1359.443794] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1359.443794] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 
1359.443794] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1359.443794] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1359.443794] nova-compute[62208]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1359.443794] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1359.457795] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68ac4c9b-449a-4f5b-92bc-ff6e8898ac62 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.461254] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.461254] nova-compute[62208]: warnings.warn( [ 1359.466967] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cf8f28e-c29b-4ad1-a7d3-3f9970c82568 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.470546] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.470546] nova-compute[62208]: warnings.warn( [ 1359.500515] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f312cbee-5412-4306-96bf-c9770a9d82e3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.503132] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.503132] nova-compute[62208]: warnings.warn( [ 1359.509096] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76a82d4d-73af-4037-8fd9-4f435943cfb3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.512952] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.512952] nova-compute[62208]: warnings.warn( [ 1359.523324] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1359.532501] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1359.553874] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.449s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1359.554488] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1359.594495] nova-compute[62208]: DEBUG nova.compute.utils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1359.596085] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1359.596338] nova-compute[62208]: DEBUG nova.network.neutron [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1359.621733] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1359.702130] nova-compute[62208]: DEBUG nova.policy [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '931207206d284e4db60fd5aabf7648f8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '178d7b9219794edc8a3f6879910bab4b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1359.712268] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1359.742846] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1359.742846] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1359.743050] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1359.743273] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1359.743424] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1359.743565] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1359.743771] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1359.743925] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1359.744108] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1359.744268] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1359.744443] nova-compute[62208]: DEBUG nova.virt.hardware [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1359.745324] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ef81eda-13b3-4e25-9d14-05b7889dd2e4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.748945] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.748945] nova-compute[62208]: warnings.warn( [ 1359.754836] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1f276b3-41af-4bb8-905c-1a82ba3f8ac0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.758771] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1359.758771] nova-compute[62208]: warnings.warn( [ 1360.051741] nova-compute[62208]: DEBUG nova.network.neutron [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Successfully created port: 34015933-cffc-43a3-a4bf-69cebe879507 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1360.327301] nova-compute[62208]: DEBUG nova.network.neutron [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Successfully created port: 8a9f5b9f-61fc-44d8-8dac-fd3773a35ace {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1360.562806] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1360.563062] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1360.563228] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Deleting the datastore file [datastore2] f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1360.563501] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5d3349ba-235d-4749-89db-461d05fbb81f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1360.565523] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1360.565523] nova-compute[62208]: warnings.warn( [ 1360.570467] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Waiting for the task: (returnval){ [ 1360.570467] nova-compute[62208]: value = "task-38576" [ 1360.570467] nova-compute[62208]: _type = "Task" [ 1360.570467] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1360.573485] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1360.573485] nova-compute[62208]: warnings.warn( [ 1360.580278] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Task: {'id': task-38576, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1360.952513] nova-compute[62208]: DEBUG nova.compute.manager [req-7bac3dc6-2630-4493-8af8-70803033fe05 req-2f141622-9ad0-47c8-b7fc-52e4f4d1f802 service nova] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Received event network-vif-plugged-34015933-cffc-43a3-a4bf-69cebe879507 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1360.952741] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-7bac3dc6-2630-4493-8af8-70803033fe05 req-2f141622-9ad0-47c8-b7fc-52e4f4d1f802 service nova] Acquiring lock "e45ad927-7d07-43d5-84b8-339c68981de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1360.952958] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-7bac3dc6-2630-4493-8af8-70803033fe05 req-2f141622-9ad0-47c8-b7fc-52e4f4d1f802 service nova] Lock "e45ad927-7d07-43d5-84b8-339c68981de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1360.953123] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-7bac3dc6-2630-4493-8af8-70803033fe05 req-2f141622-9ad0-47c8-b7fc-52e4f4d1f802 service nova] Lock "e45ad927-7d07-43d5-84b8-339c68981de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1360.953289] nova-compute[62208]: DEBUG nova.compute.manager [req-7bac3dc6-2630-4493-8af8-70803033fe05 req-2f141622-9ad0-47c8-b7fc-52e4f4d1f802 service nova] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] No waiting events found dispatching network-vif-plugged-34015933-cffc-43a3-a4bf-69cebe879507 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1360.953450] nova-compute[62208]: WARNING nova.compute.manager [req-7bac3dc6-2630-4493-8af8-70803033fe05 req-2f141622-9ad0-47c8-b7fc-52e4f4d1f802 service nova] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Received unexpected event network-vif-plugged-34015933-cffc-43a3-a4bf-69cebe879507 for instance with vm_state building and task_state spawning. [ 1361.074920] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.074920] nova-compute[62208]: warnings.warn( [ 1361.080674] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Task: {'id': task-38576, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083986} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1361.081441] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1361.081694] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1361.081912] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1361.082130] nova-compute[62208]: INFO nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Took 5.09 seconds to destroy the instance on the hypervisor. 
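The DeleteDatastoreFile_Task entries above show oslo.vmware's usual invoke-and-wait pattern: the driver calls the vSphere method through the API session, gets back a Task managed object, and wait_for_task polls it until vCenter reports completion (the "_poll_task ... progress is 0%" and "completed successfully" lines). A minimal standalone sketch of that pattern follows; the vCenter host, credentials and datastore path are placeholders for illustration, not values from this deployment.

# Sketch only; not Nova's ds_util code. Host, credentials and the datastore
# path below are placeholders.
from oslo_vmware import api

session = api.VMwareAPISession(
    'vc.example.test', 'administrator@vsphere.local', 'secret',
    api_retry_count=10, task_poll_interval=0.5, insecure=True)

file_manager = session.vim.service_content.fileManager
# Kick off the asynchronous deletion on the vCenter side. A real caller, like
# the ds_util.file_delete call above, also passes the owning Datacenter moref.
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] some-instance-uuid')
# Poll the task until vCenter reports success; raises if the task fails.
session.wait_for_task(task)

wait_for_task raises when the task ends in error, which is how the CopyVirtualDisk_Task failure later in this section surfaces as VimFaultException ("A specified parameter was not correct: fileType").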
[ 1361.084293] nova-compute[62208]: DEBUG nova.compute.claims [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936158880> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1361.084506] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1361.084758] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1361.138704] nova-compute[62208]: DEBUG nova.network.neutron [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Successfully updated port: 34015933-cffc-43a3-a4bf-69cebe879507 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1361.150209] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "refresh_cache-e45ad927-7d07-43d5-84b8-339c68981de6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1361.150275] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "refresh_cache-e45ad927-7d07-43d5-84b8-339c68981de6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1361.150374] nova-compute[62208]: DEBUG nova.network.neutron [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1361.234861] nova-compute[62208]: DEBUG nova.network.neutron [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1361.393204] nova-compute[62208]: DEBUG nova.network.neutron [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Successfully updated port: 8a9f5b9f-61fc-44d8-8dac-fd3773a35ace {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1361.409390] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "refresh_cache-f000e638-100f-4a53-853d-4a94ffe71bed" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1361.412659] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquired lock "refresh_cache-f000e638-100f-4a53-853d-4a94ffe71bed" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1361.412659] nova-compute[62208]: DEBUG nova.network.neutron [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1361.442866] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb1227ed-7b57-43c1-a6dd-4bd46c9944ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.445745] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.445745] nova-compute[62208]: warnings.warn( [ 1361.457634] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ab8a8df-d9e5-4a5a-9708-0fe2182d58e0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.461505] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.461505] nova-compute[62208]: warnings.warn( [ 1361.494162] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cd027fa-444d-4763-8c26-fe9c5a72eb0d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.497123] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.497123] nova-compute[62208]: warnings.warn( [ 1361.498556] nova-compute[62208]: DEBUG nova.network.neutron [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1361.503625] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b73e081-dcb4-46e9-8209-a272e7f00520 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.508951] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.508951] nova-compute[62208]: warnings.warn( [ 1361.522520] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1361.533705] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1361.552849] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.468s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1361.553424] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Traceback (most recent call last): [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager 
[instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] self.driver.spawn(context, instance, image_meta, [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] self._fetch_image_if_missing(context, vi) [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] image_cache(vi, tmp_image_ds_loc) [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] vm_util.copy_virtual_disk( [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] session._wait_for_task(vmdk_copy_task) [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] return self.wait_for_task(task_ref) [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] return evt.wait() [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] result = hub.switch() [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] return self.greenlet.switch() [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] self.f(*self.args, **self.kw) [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] raise exceptions.translate_fault(task_info.error) [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Faults: ['InvalidArgument'] [ 1361.553424] nova-compute[62208]: ERROR nova.compute.manager [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] [ 1361.554345] nova-compute[62208]: DEBUG nova.compute.utils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1361.555692] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Build of instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 was re-scheduled: A specified parameter was not correct: fileType [ 1361.555692] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1361.556063] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1361.556239] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1361.556407] nova-compute[62208]: DEBUG nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1361.556569] nova-compute[62208]: DEBUG nova.network.neutron [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1361.837180] nova-compute[62208]: DEBUG nova.network.neutron [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Updating instance_info_cache with network_info: [{"id": "34015933-cffc-43a3-a4bf-69cebe879507", "address": "fa:16:3e:17:72:de", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap34015933-cf", "ovs_interfaceid": "34015933-cffc-43a3-a4bf-69cebe879507", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1361.876441] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock "refresh_cache-e45ad927-7d07-43d5-84b8-339c68981de6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1361.876760] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Instance network_info: |[{"id": "34015933-cffc-43a3-a4bf-69cebe879507", "address": "fa:16:3e:17:72:de", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap34015933-cf", "ovs_interfaceid": "34015933-cffc-43a3-a4bf-69cebe879507", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1361.877191] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:17:72:de', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '10b81051-1eb1-406b-888c-4548c470c77e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '34015933-cffc-43a3-a4bf-69cebe879507', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1361.884752] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating folder: Project (910bab22145d4f8cbd354ecf005eed6a). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1361.885109] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b8c4a819-a4a9-4b1b-86ab-8d668566c483 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.887279] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.887279] nova-compute[62208]: warnings.warn( [ 1361.898806] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Created folder: Project (910bab22145d4f8cbd354ecf005eed6a) in parent group-v17427. [ 1361.898806] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating folder: Instances. Parent ref: group-v17534. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1361.898806] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bc491a0a-1759-4e62-a33c-54e11e56ddcd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.901904] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.901904] nova-compute[62208]: warnings.warn( [ 1361.910535] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Created folder: Instances in parent group-v17534. [ 1361.910795] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1361.910989] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1361.911195] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3687d5cf-1449-4811-b8b2-0a12fa78d534 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.926698] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.926698] nova-compute[62208]: warnings.warn( [ 1361.933721] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1361.933721] nova-compute[62208]: value = "task-38579" [ 1361.933721] nova-compute[62208]: _type = "Task" [ 1361.933721] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1361.937622] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1361.937622] nova-compute[62208]: warnings.warn( [ 1361.946252] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1361.946495] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1361.946663] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38579, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1362.062242] nova-compute[62208]: DEBUG nova.network.neutron [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Updating instance_info_cache with network_info: [{"id": "8a9f5b9f-61fc-44d8-8dac-fd3773a35ace", "address": "fa:16:3e:a6:a7:f5", "network": {"id": "2a58a4a4-f5e7-409c-9873-19d373cb9375", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-608537532-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "178d7b9219794edc8a3f6879910bab4b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": "nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a9f5b9f-61", "ovs_interfaceid": "8a9f5b9f-61fc-44d8-8dac-fd3773a35ace", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1362.085437] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Releasing lock "refresh_cache-f000e638-100f-4a53-853d-4a94ffe71bed" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1362.085768] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Instance network_info: 
|[{"id": "8a9f5b9f-61fc-44d8-8dac-fd3773a35ace", "address": "fa:16:3e:a6:a7:f5", "network": {"id": "2a58a4a4-f5e7-409c-9873-19d373cb9375", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-608537532-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "178d7b9219794edc8a3f6879910bab4b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": "nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a9f5b9f-61", "ovs_interfaceid": "8a9f5b9f-61fc-44d8-8dac-fd3773a35ace", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1362.086217] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a6:a7:f5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8a9f5b9f-61fc-44d8-8dac-fd3773a35ace', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1362.093609] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Creating folder: Project (178d7b9219794edc8a3f6879910bab4b). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1362.094781] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-db55ae35-f4e6-4af0-bc5e-0626fc5b63dc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.096514] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.096514] nova-compute[62208]: warnings.warn( [ 1362.106926] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Created folder: Project (178d7b9219794edc8a3f6879910bab4b) in parent group-v17427. [ 1362.107123] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Creating folder: Instances. Parent ref: group-v17537. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1362.107368] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-af807fec-f846-4b0b-9316-d9d8f7281d37 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.109156] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.109156] nova-compute[62208]: warnings.warn( [ 1362.118008] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Created folder: Instances in parent group-v17537. [ 1362.118375] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1362.118636] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1362.118913] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a3780b07-59f1-4441-b4ad-178d87033ef5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.137186] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.137186] nova-compute[62208]: warnings.warn( [ 1362.143221] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1362.143221] nova-compute[62208]: value = "task-38582" [ 1362.143221] nova-compute[62208]: _type = "Task" [ 1362.143221] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1362.146968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.146968] nova-compute[62208]: warnings.warn( [ 1362.153217] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38582, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1362.379367] nova-compute[62208]: DEBUG nova.network.neutron [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1362.395848] nova-compute[62208]: INFO nova.compute.manager [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Took 0.84 seconds to deallocate network for instance. [ 1362.442282] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.442282] nova-compute[62208]: warnings.warn( [ 1362.448766] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38579, 'name': CreateVM_Task, 'duration_secs': 0.326232} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1362.449001] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1362.449648] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1362.449878] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.452842] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-685fa883-be26-4e7c-9d40-4ed4a0596254 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.464343] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.464343] nova-compute[62208]: warnings.warn( [ 1362.494468] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Reconfiguring VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1362.495297] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6d06140e-8421-400b-9846-93b2f0e61d44 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.506753] nova-compute[62208]: INFO nova.scheduler.client.report [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Deleted allocations for instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 [ 1362.513281] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.513281] nova-compute[62208]: warnings.warn( [ 1362.519501] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 1362.519501] nova-compute[62208]: value = "task-38583" [ 1362.519501] nova-compute[62208]: _type = "Task" [ 1362.519501] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1362.523246] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.523246] nova-compute[62208]: warnings.warn( [ 1362.532872] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38583, 'name': ReconfigVM_Task} progress is 6%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1362.534051] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4ec86499-99c2-4dee-ab4e-ba18e0090a83 tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 643.748s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1362.535298] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 447.144s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.535514] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Acquiring lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1362.535720] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.535983] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1362.538624] nova-compute[62208]: INFO nova.compute.manager [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Terminating instance [ 1362.540728] nova-compute[62208]: DEBUG nova.compute.manager [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1362.540953] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1362.541217] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0b661ddd-ce92-4e88-b33f-e7e7b9329020 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.543154] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.543154] nova-compute[62208]: warnings.warn( [ 1362.550807] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5eaef65-8e53-4ab9-8462-2fc68c588dca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.563118] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1362.565778] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.565778] nova-compute[62208]: warnings.warn( [ 1362.585625] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1 could not be found. [ 1362.585943] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1362.586153] nova-compute[62208]: INFO nova.compute.manager [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1362.586466] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1362.586724] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1362.586825] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1362.633876] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1362.648122] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1362.648389] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.649929] nova-compute[62208]: INFO nova.compute.claims [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1362.653028] nova-compute[62208]: INFO nova.compute.manager [-] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] Took 0.07 seconds to deallocate network for instance. [ 1362.653390] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.653390] nova-compute[62208]: warnings.warn( [ 1362.661592] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38582, 'name': CreateVM_Task} progress is 99%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1362.776736] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a7e5f04c-5449-4e98-a843-a049bd096f1f tempest-ServerRescueNegativeTestJSON-959391720 tempest-ServerRescueNegativeTestJSON-959391720-project-member] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.241s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1362.777559] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 188.355s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.777738] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1] During sync_power_state the instance has a pending task (deleting). Skip. [ 1362.777958] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "f4112bb2-e8b7-435c-9fe8-7c4dbe55b3e1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1362.935769] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22160145-24e6-4f64-8585-2c064f1fe7d7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.938479] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.938479] nova-compute[62208]: warnings.warn( [ 1362.943825] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d49c98f3-d560-4d2e-a663-027dbb101082 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.946938] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.946938] nova-compute[62208]: warnings.warn( [ 1362.976465] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83883d3d-9694-4125-84bd-56c9ac974377 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.979568] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.979568] nova-compute[62208]: warnings.warn( [ 1362.985902] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f75e6bd-1d1d-4d51-89d9-eb5280bab211 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.992870] nova-compute[62208]: DEBUG nova.compute.manager [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Received event network-vif-plugged-8a9f5b9f-61fc-44d8-8dac-fd3773a35ace {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1362.993165] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Acquiring lock "f000e638-100f-4a53-853d-4a94ffe71bed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1362.993458] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Lock "f000e638-100f-4a53-853d-4a94ffe71bed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.993685] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Lock "f000e638-100f-4a53-853d-4a94ffe71bed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1362.993907] nova-compute[62208]: DEBUG nova.compute.manager [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] No waiting events found dispatching network-vif-plugged-8a9f5b9f-61fc-44d8-8dac-fd3773a35ace {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1362.994382] nova-compute[62208]: WARNING nova.compute.manager [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Received unexpected event network-vif-plugged-8a9f5b9f-61fc-44d8-8dac-fd3773a35ace for instance with vm_state building and task_state spawning. [ 1362.994382] nova-compute[62208]: DEBUG nova.compute.manager [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Received event network-changed-34015933-cffc-43a3-a4bf-69cebe879507 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1362.994499] nova-compute[62208]: DEBUG nova.compute.manager [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Refreshing instance network info cache due to event network-changed-34015933-cffc-43a3-a4bf-69cebe879507. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1362.994718] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Acquiring lock "refresh_cache-e45ad927-7d07-43d5-84b8-339c68981de6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1362.994902] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Acquired lock "refresh_cache-e45ad927-7d07-43d5-84b8-339c68981de6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1362.995123] nova-compute[62208]: DEBUG nova.network.neutron [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Refreshing network info cache for port 34015933-cffc-43a3-a4bf-69cebe879507 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1362.996270] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1362.996270] nova-compute[62208]: warnings.warn( [ 1363.006740] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1363.016056] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1363.024802] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.024802] nova-compute[62208]: warnings.warn( [ 1363.030938] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38583, 'name': ReconfigVM_Task, 'duration_secs': 0.114692} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1363.031323] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Reconfigured VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1363.031596] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.582s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1363.031929] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1363.032158] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1363.032522] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1363.032846] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-00ac500d-22a2-45c5-839e-9907a5a06ece {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.035478] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.387s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1363.036042] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1363.038407] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.038407] nova-compute[62208]: warnings.warn( [ 1363.044909] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 1363.044909] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5227c9ee-056a-4d81-56af-f7c827d9accd" [ 1363.044909] nova-compute[62208]: _type = "Task" [ 1363.044909] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.048569] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.048569] nova-compute[62208]: warnings.warn( [ 1363.054656] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5227c9ee-056a-4d81-56af-f7c827d9accd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1363.078163] nova-compute[62208]: DEBUG nova.compute.utils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1363.079706] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1363.079969] nova-compute[62208]: DEBUG nova.network.neutron [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1363.092414] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1363.140841] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1363.147897] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.147897] nova-compute[62208]: warnings.warn( [ 1363.156086] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38582, 'name': CreateVM_Task, 'duration_secs': 0.547606} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1363.156365] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1363.159398] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1363.159721] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1363.162750] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12ff4838-0046-4efc-8b28-d6032d249a92 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.175310] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1363.184507] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.184507] nova-compute[62208]: warnings.warn( [ 1363.201964] nova-compute[62208]: DEBUG nova.policy [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8cb00a6413b46fcb17cbe532a0bffc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '53b578fa6aa34a2d80eb9938d58ffe12', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1363.208754] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Reconfiguring VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1363.211438] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1363.212128] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1363.212364] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1363.212614] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1363.212820] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image pref 0:0:0 {{(pid=62208) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1363.213025] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1363.213289] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1363.213507] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1363.213733] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1363.213959] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1363.214204] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1363.214855] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-e0ca81bd-826d-4413-b143-f71783345449 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.226472] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adf12b38-c13d-41dd-bfd4-d78fe7daece1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.231819] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.231819] nova-compute[62208]: warnings.warn( [ 1363.232233] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.232233] nova-compute[62208]: warnings.warn( [ 1363.238717] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075065b3-cab6-4aa1-a110-0a8e3d69b6c4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.242715] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 1363.242715] nova-compute[62208]: value = "task-38584" [ 1363.242715] nova-compute[62208]: _type = "Task" [ 1363.242715] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.242999] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.242999] nova-compute[62208]: warnings.warn( [ 1363.253924] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.253924] nova-compute[62208]: warnings.warn( [ 1363.259204] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38584, 'name': ReconfigVM_Task} progress is 14%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1363.550179] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.550179] nova-compute[62208]: warnings.warn( [ 1363.565180] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1363.565180] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1363.565180] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1363.599829] nova-compute[62208]: DEBUG nova.network.neutron [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Updated VIF entry in instance network info cache for port 34015933-cffc-43a3-a4bf-69cebe879507. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1363.600414] nova-compute[62208]: DEBUG nova.network.neutron [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Updating instance_info_cache with network_info: [{"id": "34015933-cffc-43a3-a4bf-69cebe879507", "address": "fa:16:3e:17:72:de", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap34015933-cf", "ovs_interfaceid": "34015933-cffc-43a3-a4bf-69cebe879507", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1363.611914] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Releasing lock "refresh_cache-e45ad927-7d07-43d5-84b8-339c68981de6" {{(pid=62208) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1363.612212] nova-compute[62208]: DEBUG nova.compute.manager [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Received event network-changed-8a9f5b9f-61fc-44d8-8dac-fd3773a35ace {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1363.612388] nova-compute[62208]: DEBUG nova.compute.manager [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Refreshing instance network info cache due to event network-changed-8a9f5b9f-61fc-44d8-8dac-fd3773a35ace. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1363.612605] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Acquiring lock "refresh_cache-f000e638-100f-4a53-853d-4a94ffe71bed" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1363.612746] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Acquired lock "refresh_cache-f000e638-100f-4a53-853d-4a94ffe71bed" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1363.612912] nova-compute[62208]: DEBUG nova.network.neutron [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Refreshing network info cache for port 8a9f5b9f-61fc-44d8-8dac-fd3773a35ace {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1363.660401] nova-compute[62208]: DEBUG nova.network.neutron [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Successfully created port: d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1363.752493] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.752493] nova-compute[62208]: warnings.warn( [ 1363.758646] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38584, 'name': ReconfigVM_Task, 'duration_secs': 0.118335} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1363.758950] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Reconfigured VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1363.759171] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.599s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1363.759415] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1363.759559] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1363.759869] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1363.760490] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b17ad9d9-f4a5-47bc-993e-57ef7c23f1a6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.769184] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.769184] nova-compute[62208]: warnings.warn( [ 1363.773178] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 1363.773178] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521dff80-7601-bbb1-6c1c-09538706cf47" [ 1363.773178] nova-compute[62208]: _type = "Task" [ 1363.773178] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.776535] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1363.776535] nova-compute[62208]: warnings.warn( [ 1363.782544] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521dff80-7601-bbb1-6c1c-09538706cf47, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1364.065789] nova-compute[62208]: DEBUG nova.network.neutron [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Updated VIF entry in instance network info cache for port 8a9f5b9f-61fc-44d8-8dac-fd3773a35ace. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1364.066182] nova-compute[62208]: DEBUG nova.network.neutron [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Updating instance_info_cache with network_info: [{"id": "8a9f5b9f-61fc-44d8-8dac-fd3773a35ace", "address": "fa:16:3e:a6:a7:f5", "network": {"id": "2a58a4a4-f5e7-409c-9873-19d373cb9375", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-608537532-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "178d7b9219794edc8a3f6879910bab4b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": "nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a9f5b9f-61", "ovs_interfaceid": "8a9f5b9f-61fc-44d8-8dac-fd3773a35ace", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1364.076657] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-93bda1d8-99dc-44cd-9219-2b9c9d0177ed req-809eed9c-5722-499b-bb2d-1a4fa33658a7 service nova] Releasing lock "refresh_cache-f000e638-100f-4a53-853d-4a94ffe71bed" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1364.136260] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1364.140993] nova-compute[62208]: DEBUG 
oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1364.141175] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1364.141301] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1364.162437] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.162604] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.162737] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.162866] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.162991] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.163113] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.163233] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.163350] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.163467] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.163584] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1364.163703] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1364.164264] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1364.277541] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1364.277541] nova-compute[62208]: warnings.warn( [ 1364.284401] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1364.284669] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1364.284889] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1364.477224] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1364.510208] nova-compute[62208]: DEBUG nova.network.neutron [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 
tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Successfully updated port: d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1364.523248] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "refresh_cache-bbb642f2-aa5c-4e71-b25b-e32acd45f879" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1364.523481] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "refresh_cache-bbb642f2-aa5c-4e71-b25b-e32acd45f879" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1364.523552] nova-compute[62208]: DEBUG nova.network.neutron [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1364.804624] nova-compute[62208]: DEBUG nova.network.neutron [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1364.982235] nova-compute[62208]: DEBUG nova.network.neutron [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Updating instance_info_cache with network_info: [{"id": "d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521", "address": "fa:16:3e:0a:c8:3f", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd13dd7fc-9b", "ovs_interfaceid": "d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1365.005194] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 
tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "refresh_cache-bbb642f2-aa5c-4e71-b25b-e32acd45f879" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1365.005547] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Instance network_info: |[{"id": "d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521", "address": "fa:16:3e:0a:c8:3f", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd13dd7fc-9b", "ovs_interfaceid": "d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1365.006423] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0a:c8:3f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b5f9472-1844-4c99-8804-8f193cfff562', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd13dd7fc-9b6f-4bef-81ab-0e97ac5c8521', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1365.015231] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating folder: Project (53b578fa6aa34a2d80eb9938d58ffe12). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1365.015937] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-21d19bac-7f87-4a79-a0e7-225054283c35 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.017905] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1365.017905] nova-compute[62208]: warnings.warn( [ 1365.027288] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created folder: Project (53b578fa6aa34a2d80eb9938d58ffe12) in parent group-v17427. [ 1365.027288] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating folder: Instances. Parent ref: group-v17540. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1365.027288] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9e761634-7d10-4bd4-bab0-56d99376cfe1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.029662] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1365.029662] nova-compute[62208]: warnings.warn( [ 1365.040780] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created folder: Instances in parent group-v17540. [ 1365.041092] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1365.041456] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1365.041539] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8d376888-97b4-43e5-9f71-56378f406589 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.058253] nova-compute[62208]: DEBUG nova.compute.manager [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Received event network-vif-plugged-d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1365.058513] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] Acquiring lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1365.058741] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.058940] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.059122] nova-compute[62208]: DEBUG nova.compute.manager [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] No waiting events found dispatching network-vif-plugged-d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1365.059311] nova-compute[62208]: WARNING nova.compute.manager [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Received unexpected event network-vif-plugged-d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521 for instance with vm_state building and task_state spawning. 
[ 1365.059491] nova-compute[62208]: DEBUG nova.compute.manager [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Received event network-changed-d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1365.059648] nova-compute[62208]: DEBUG nova.compute.manager [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Refreshing instance network info cache due to event network-changed-d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1365.059836] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] Acquiring lock "refresh_cache-bbb642f2-aa5c-4e71-b25b-e32acd45f879" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1365.059977] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] Acquired lock "refresh_cache-bbb642f2-aa5c-4e71-b25b-e32acd45f879" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1365.060188] nova-compute[62208]: DEBUG nova.network.neutron [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Refreshing network info cache for port d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1365.061306] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1365.061306] nova-compute[62208]: warnings.warn( [ 1365.067169] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1365.067169] nova-compute[62208]: value = "task-38587" [ 1365.067169] nova-compute[62208]: _type = "Task" [ 1365.067169] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1365.073323] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1365.073323] nova-compute[62208]: warnings.warn( [ 1365.079452] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38587, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1365.141460] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1365.413870] nova-compute[62208]: DEBUG nova.network.neutron [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Updated VIF entry in instance network info cache for port d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1365.414261] nova-compute[62208]: DEBUG nova.network.neutron [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Updating instance_info_cache with network_info: [{"id": "d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521", "address": "fa:16:3e:0a:c8:3f", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd13dd7fc-9b", "ovs_interfaceid": "d13dd7fc-9b6f-4bef-81ab-0e97ac5c8521", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1365.424242] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2c234e6f-d7c7-40b2-ae58-714edec744f2 req-da8911cc-d9af-4747-83b3-23d0f57009a6 service nova] Releasing lock "refresh_cache-bbb642f2-aa5c-4e71-b25b-e32acd45f879" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1365.571420] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1365.571420] nova-compute[62208]: warnings.warn( [ 1365.577621] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38587, 'name': CreateVM_Task, 'duration_secs': 0.315466} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1365.577832] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1365.578559] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1365.578790] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.582151] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec3f8d13-7ee9-4a36-ac29-316b24d4e634 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.594053] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1365.594053] nova-compute[62208]: warnings.warn( [ 1365.619573] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Reconfiguring VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1365.620037] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b788a60f-9fe4-4c7b-875d-6dfb13951fbb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.629941] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1365.629941] nova-compute[62208]: warnings.warn( [ 1365.637918] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 1365.637918] nova-compute[62208]: value = "task-38588" [ 1365.637918] nova-compute[62208]: _type = "Task" [ 1365.637918] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1365.641003] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1365.641003] nova-compute[62208]: warnings.warn( [ 1365.646352] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38588, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1366.142841] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1366.143849] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.143849] nova-compute[62208]: warnings.warn( [ 1366.150412] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38588, 'name': ReconfigVM_Task, 'duration_secs': 0.111524} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1366.150701] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Reconfigured VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1366.150917] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.572s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1366.151166] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1366.151311] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1366.151636] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1366.152547] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0f219278-d8b9-4f97-974f-f6d9f82fcc43 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.154890] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1366.155168] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1366.155346] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
1366.155495] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1366.156663] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc2d126b-dae5-41a4-a78a-81bf551aa996 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.159402] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.159402] nova-compute[62208]: warnings.warn( [ 1366.159971] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.159971] nova-compute[62208]: warnings.warn( [ 1366.163230] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 1366.163230] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5251ef89-bf5c-e29c-d049-6373de1237b7" [ 1366.163230] nova-compute[62208]: _type = "Task" [ 1366.163230] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1366.169515] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f50cdbd-5590-45ac-8662-44628a8cecf8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.173474] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.173474] nova-compute[62208]: warnings.warn( [ 1366.173883] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.173883] nova-compute[62208]: warnings.warn( [ 1366.178745] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5251ef89-bf5c-e29c-d049-6373de1237b7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1366.188346] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1b08f20-7d48-4bac-b459-48c45225ec9f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.191172] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.191172] nova-compute[62208]: warnings.warn( [ 1366.195991] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b03d4da7-486e-4f7b-a5d5-9820993d29ab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.199700] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.199700] nova-compute[62208]: warnings.warn( [ 1366.226568] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181965MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1366.226804] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1366.227061] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1366.295037] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5c7531e-b496-4aed-be05-f1a96391e327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.295281] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.295476] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.295652] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.295834] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.296045] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.296240] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.296442] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.296620] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.296787] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1366.307904] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1366.318522] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 795665a3-58eb-4d8a-bedc-84e399e11bb7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1366.328201] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance a72bbc63-b475-4be7-a412-f0f893a094f4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1366.339422] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b7855bc3-7f66-4755-b8bb-82604ae49df5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1366.348982] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1366.359215] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1366.359444] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1366.359588] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1366.576102] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee5bde67-e4d0-4f3b-a8e8-df1f01f70042 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.578726] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.578726] nova-compute[62208]: warnings.warn( [ 1366.584224] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f63b35b-00bd-424c-9d97-586e9b039f03 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.590184] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.590184] nova-compute[62208]: warnings.warn( [ 1366.621836] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be1e66d5-dad0-4af5-82d9-a8fa27f7dbfc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.626110] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.626110] nova-compute[62208]: warnings.warn( [ 1366.633334] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ada4314-ebef-4f03-a558-e7eba7ffce0b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.637287] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.637287] nova-compute[62208]: warnings.warn( [ 1366.647333] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1366.657688] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1366.669765] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1366.669765] nova-compute[62208]: warnings.warn( [ 1366.676186] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1366.676483] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1366.676710] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1366.677795] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1366.678080] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.451s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.443247] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquiring lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1367.676733] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1369.136240] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1369.176035] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1369.176035] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1398.961599] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "e45ad927-7d07-43d5-84b8-339c68981de6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1405.484308] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: 
Remote end closed connection without response [ 1405.484308] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1405.484903] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1405.487151] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1405.487443] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Copying Virtual Disk [datastore2] vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/8bc80da1-e4e7-46a8-9971-95d273ce942a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1405.487776] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-15f2bd32-f14e-48ed-a301-db75fc86e09e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1405.490385] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1405.490385] nova-compute[62208]: warnings.warn( [ 1405.496630] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1405.496630] nova-compute[62208]: value = "task-38589" [ 1405.496630] nova-compute[62208]: _type = "Task" [ 1405.496630] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1405.499843] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1405.499843] nova-compute[62208]: warnings.warn( [ 1405.505468] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38589, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1406.001727] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.001727] nova-compute[62208]: warnings.warn( [ 1406.008696] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1406.009046] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1406.009704] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Traceback (most recent call last): [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] yield resources [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] self.driver.spawn(context, instance, image_meta, [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] self._fetch_image_if_missing(context, vi) [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager 
[instance: d5c7531e-b496-4aed-be05-f1a96391e327] image_cache(vi, tmp_image_ds_loc) [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] vm_util.copy_virtual_disk( [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] session._wait_for_task(vmdk_copy_task) [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] return self.wait_for_task(task_ref) [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] return evt.wait() [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] result = hub.switch() [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] return self.greenlet.switch() [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] self.f(*self.args, **self.kw) [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] raise exceptions.translate_fault(task_info.error) [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Faults: ['InvalidArgument'] [ 1406.009704] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] [ 1406.010707] nova-compute[62208]: INFO nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 
tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Terminating instance [ 1406.011691] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1406.011925] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1406.012194] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-447244b1-8cb8-40f5-9576-eba43a3065b1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.014743] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1406.014954] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1406.015795] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0517ed26-e7b5-4d44-920c-be275e2b66fb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.018333] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.018333] nova-compute[62208]: warnings.warn( [ 1406.018682] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.018682] nova-compute[62208]: warnings.warn( [ 1406.023301] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1406.023540] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5f8802a1-c091-4b7e-9905-d60ed3356a45 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.025913] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1406.026125] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1406.026731] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.026731] nova-compute[62208]: warnings.warn( [ 1406.027216] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-28107515-6baa-4244-af58-e5f74ff78013 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.030208] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.030208] nova-compute[62208]: warnings.warn( [ 1406.033308] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Waiting for the task: (returnval){ [ 1406.033308] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b1fef9-b5d4-5e3c-21ae-e0385b4b55ef" [ 1406.033308] nova-compute[62208]: _type = "Task" [ 1406.033308] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1406.036336] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.036336] nova-compute[62208]: warnings.warn( [ 1406.042141] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b1fef9-b5d4-5e3c-21ae-e0385b4b55ef, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1406.096466] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1406.096752] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1406.097003] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleting the datastore file [datastore2] d5c7531e-b496-4aed-be05-f1a96391e327 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1406.097387] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4920e2b6-e52d-4f03-aaa9-a95b6ccd710a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.099334] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.099334] nova-compute[62208]: warnings.warn( [ 1406.103896] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1406.103896] nova-compute[62208]: value = "task-38591" [ 1406.103896] nova-compute[62208]: _type = "Task" [ 1406.103896] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1406.107383] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.107383] nova-compute[62208]: warnings.warn( [ 1406.112446] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38591, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1406.537448] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.537448] nova-compute[62208]: warnings.warn( [ 1406.544884] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1406.545115] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Creating directory with path [datastore2] vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1406.545366] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9ed3bff6-8e3b-4301-aa3d-4949f37e96bf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.547135] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.547135] nova-compute[62208]: warnings.warn( [ 1406.557244] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Created directory with path [datastore2] vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1406.557456] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Fetch image to [datastore2] vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1406.557628] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1406.558587] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce4b5221-7bd8-4e9c-9962-8ce1d55f102c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.561186] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.561186] nova-compute[62208]: warnings.warn( [ 1406.566277] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45ae9e54-0b8e-480a-a1eb-5a50528d7c4b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.568757] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.568757] nova-compute[62208]: warnings.warn( [ 1406.576876] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71a89a20-88ea-4536-87a0-9a847f362f60 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.580750] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.580750] nova-compute[62208]: warnings.warn( [ 1406.629090] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0574f25-eb7a-4009-b711-109ba412a77a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.631317] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.631317] nova-compute[62208]: warnings.warn( [ 1406.631713] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.631713] nova-compute[62208]: warnings.warn( [ 1406.638503] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-046e362c-d07e-4e16-8e99-b4975af1998d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1406.640312] nova-compute[62208]: DEBUG oslo_vmware.api [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38591, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073717} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1406.640554] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1406.640737] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1406.640910] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1406.641088] nova-compute[62208]: INFO nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1406.642457] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1406.642457] nova-compute[62208]: warnings.warn( [ 1406.643209] nova-compute[62208]: DEBUG nova.compute.claims [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9363120b0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1406.643403] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1406.643614] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1406.666543] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1406.739381] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1406.793882] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1406.794062] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1407.008177] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f51e3dd-ce75-4d3d-9a6c-5fa512bb8711 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1407.011355] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1407.011355] nova-compute[62208]: warnings.warn( [ 1407.016754] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ddd413-b15c-4723-a469-d89183b4a95e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1407.021473] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1407.021473] nova-compute[62208]: warnings.warn( [ 1407.052018] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bafa7ebb-1b80-4d3e-9357-9a6ad5b53428 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1407.054634] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1407.054634] nova-compute[62208]: warnings.warn( [ 1407.060922] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e39b057-3fc9-4f1c-ba26-9874c9446d03 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1407.064937] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1407.064937] nova-compute[62208]: warnings.warn( [ 1407.074847] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1407.084734] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1407.103151] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.459s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1407.103680] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Traceback (most recent call last): [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] self.driver.spawn(context, instance, image_meta, [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] self._fetch_image_if_missing(context, vi) [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] image_cache(vi, tmp_image_ds_loc) [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] vm_util.copy_virtual_disk( [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] session._wait_for_task(vmdk_copy_task) [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] return self.wait_for_task(task_ref) [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] return evt.wait() [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] result = hub.switch() [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] return self.greenlet.switch() [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] self.f(*self.args, **self.kw) [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] raise exceptions.translate_fault(task_info.error) [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Faults: ['InvalidArgument'] [ 1407.103680] nova-compute[62208]: ERROR nova.compute.manager [instance: d5c7531e-b496-4aed-be05-f1a96391e327] [ 1407.104462] nova-compute[62208]: DEBUG nova.compute.utils 
[None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1407.105956] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Build of instance d5c7531e-b496-4aed-be05-f1a96391e327 was re-scheduled: A specified parameter was not correct: fileType [ 1407.105956] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1407.106511] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1407.106778] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1407.107023] nova-compute[62208]: DEBUG nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1407.107182] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1407.462206] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "ad00920b-3783-4c01-bb25-4f923d29dad7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1407.462762] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1407.708494] nova-compute[62208]: DEBUG nova.network.neutron [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] 
[instance: d5c7531e-b496-4aed-be05-f1a96391e327] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1407.723544] nova-compute[62208]: INFO nova.compute.manager [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Took 0.62 seconds to deallocate network for instance. [ 1407.822539] nova-compute[62208]: INFO nova.scheduler.client.report [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted allocations for instance d5c7531e-b496-4aed-be05-f1a96391e327 [ 1407.846832] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-3d95f8f4-42e1-4ee0-a842-46d0c2b7bbdb tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "d5c7531e-b496-4aed-be05-f1a96391e327" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 539.272s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1407.848135] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "d5c7531e-b496-4aed-be05-f1a96391e327" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 342.985s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1407.848351] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "d5c7531e-b496-4aed-be05-f1a96391e327-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1407.848554] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "d5c7531e-b496-4aed-be05-f1a96391e327-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1407.848716] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "d5c7531e-b496-4aed-be05-f1a96391e327-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1407.851524] nova-compute[62208]: INFO nova.compute.manager [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Terminating instance [ 1407.853273] 
nova-compute[62208]: DEBUG nova.compute.manager [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1407.853461] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1407.853961] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-db339c47-dc2e-4082-a4aa-9e97bd098868 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1407.856171] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1407.856171] nova-compute[62208]: warnings.warn( [ 1407.868825] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a55ff4af-6a70-4693-ae82-343814a3a177 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1407.881358] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1407.884126] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1407.884126] nova-compute[62208]: warnings.warn( [ 1407.905579] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d5c7531e-b496-4aed-be05-f1a96391e327 could not be found. [ 1407.905824] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1407.906034] nova-compute[62208]: INFO nova.compute.manager [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1407.906286] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1407.906530] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1407.906620] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1407.959676] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1407.970802] nova-compute[62208]: INFO nova.compute.manager [-] [instance: d5c7531e-b496-4aed-be05-f1a96391e327] Took 0.06 seconds to deallocate network for instance. [ 1407.971906] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1407.972150] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1407.973621] nova-compute[62208]: INFO nova.compute.claims [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1408.098340] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f0e093e-dc6a-4e4e-a435-a65553eeae3d tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "d5c7531e-b496-4aed-be05-f1a96391e327" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.250s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1408.099233] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "d5c7531e-b496-4aed-be05-f1a96391e327" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 233.676s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1408.099424] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 
d5c7531e-b496-4aed-be05-f1a96391e327] During sync_power_state the instance has a pending task (deleting). Skip. [ 1408.099597] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "d5c7531e-b496-4aed-be05-f1a96391e327" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1408.296604] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d051828-5b44-4247-ab7e-29db48c08184 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1408.300250] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1408.300250] nova-compute[62208]: warnings.warn( [ 1408.307149] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5f5fa5c-d75a-49d8-bbc5-da0f50653e79 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1408.311323] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1408.311323] nova-compute[62208]: warnings.warn( [ 1408.343252] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26932fb3-6041-446d-9bc7-dbc77ec6b287 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1408.345759] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1408.345759] nova-compute[62208]: warnings.warn( [ 1408.351540] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91e7e070-40a4-4a9d-b723-4796bfc1759c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1408.356561] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1408.356561] nova-compute[62208]: warnings.warn( [ 1408.366515] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1408.377468] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1408.395004] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.423s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1408.395616] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1408.431681] nova-compute[62208]: DEBUG nova.compute.utils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1408.433240] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1408.433401] nova-compute[62208]: DEBUG nova.network.neutron [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1408.444198] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1408.485780] nova-compute[62208]: DEBUG nova.policy [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c80e7a39b71446eb86cc235973d9eb55', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a607d976ab14539a9d204c9437c3522', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1408.518896] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1408.542022] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1408.542314] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1408.542432] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1408.542614] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1408.542762] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1408.542911] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c 
tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1408.543123] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1408.543284] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1408.543473] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1408.543603] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1408.543756] nova-compute[62208]: DEBUG nova.virt.hardware [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1408.544714] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f31f0c5-2886-4862-8931-95fa6562f8fc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1408.547304] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1408.547304] nova-compute[62208]: warnings.warn( [ 1408.553257] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7281ff7e-9717-4266-bf5b-4e968e9a048d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1408.557895] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1408.557895] nova-compute[62208]: warnings.warn( [ 1408.856799] nova-compute[62208]: DEBUG nova.network.neutron [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Successfully created port: 80131c19-baa0-4896-9681-6b5a4d061684 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1409.178836] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "f000e638-100f-4a53-853d-4a94ffe71bed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1409.516851] nova-compute[62208]: DEBUG nova.network.neutron [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Successfully updated port: 80131c19-baa0-4896-9681-6b5a4d061684 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1409.529677] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "refresh_cache-ec31fb88-38c6-400d-b1ec-c93af711a1f6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1409.529795] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquired lock "refresh_cache-ec31fb88-38c6-400d-b1ec-c93af711a1f6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1409.529921] nova-compute[62208]: DEBUG nova.network.neutron [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1409.598085] nova-compute[62208]: DEBUG nova.network.neutron [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1409.769438] nova-compute[62208]: DEBUG nova.network.neutron [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Updating instance_info_cache with network_info: [{"id": "80131c19-baa0-4896-9681-6b5a4d061684", "address": "fa:16:3e:72:94:0f", "network": {"id": "4ff34335-1f93-40d4-a4ee-e581b57a773a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1917389442-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7a607d976ab14539a9d204c9437c3522", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f5fe645c-e088-401e-ab53-4ae2981dea72", "external-id": "nsx-vlan-transportzone-219", "segmentation_id": 219, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80131c19-ba", "ovs_interfaceid": "80131c19-baa0-4896-9681-6b5a4d061684", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1409.776452] nova-compute[62208]: DEBUG nova.compute.manager [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Received event network-vif-plugged-80131c19-baa0-4896-9681-6b5a4d061684 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1409.776452] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] Acquiring lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1409.776452] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1409.776641] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1409.776727] nova-compute[62208]: DEBUG nova.compute.manager [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] No waiting events found dispatching network-vif-plugged-80131c19-baa0-4896-9681-6b5a4d061684 
{{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1409.777026] nova-compute[62208]: WARNING nova.compute.manager [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Received unexpected event network-vif-plugged-80131c19-baa0-4896-9681-6b5a4d061684 for instance with vm_state building and task_state spawning. [ 1409.777026] nova-compute[62208]: DEBUG nova.compute.manager [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Received event network-changed-80131c19-baa0-4896-9681-6b5a4d061684 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1409.777185] nova-compute[62208]: DEBUG nova.compute.manager [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Refreshing instance network info cache due to event network-changed-80131c19-baa0-4896-9681-6b5a4d061684. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1409.777320] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] Acquiring lock "refresh_cache-ec31fb88-38c6-400d-b1ec-c93af711a1f6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1409.786056] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Releasing lock "refresh_cache-ec31fb88-38c6-400d-b1ec-c93af711a1f6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1409.786351] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Instance network_info: |[{"id": "80131c19-baa0-4896-9681-6b5a4d061684", "address": "fa:16:3e:72:94:0f", "network": {"id": "4ff34335-1f93-40d4-a4ee-e581b57a773a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1917389442-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7a607d976ab14539a9d204c9437c3522", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f5fe645c-e088-401e-ab53-4ae2981dea72", "external-id": "nsx-vlan-transportzone-219", "segmentation_id": 219, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80131c19-ba", "ovs_interfaceid": "80131c19-baa0-4896-9681-6b5a4d061684", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1409.786678] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service 
nova] Acquired lock "refresh_cache-ec31fb88-38c6-400d-b1ec-c93af711a1f6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1409.786857] nova-compute[62208]: DEBUG nova.network.neutron [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Refreshing network info cache for port 80131c19-baa0-4896-9681-6b5a4d061684 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1409.787980] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:72:94:0f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f5fe645c-e088-401e-ab53-4ae2981dea72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '80131c19-baa0-4896-9681-6b5a4d061684', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1409.796333] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Creating folder: Project (7a607d976ab14539a9d204c9437c3522). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1409.797673] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6ea598ba-cea1-4151-a18f-ea14437ddd7a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1409.801733] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1409.801733] nova-compute[62208]: warnings.warn( [ 1409.812242] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Created folder: Project (7a607d976ab14539a9d204c9437c3522) in parent group-v17427. [ 1409.812442] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Creating folder: Instances. Parent ref: group-v17543. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1409.812663] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-54f4ef49-793e-4080-a397-050dac108439 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1409.814752] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1409.814752] nova-compute[62208]: warnings.warn( [ 1409.823259] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Created folder: Instances in parent group-v17543. [ 1409.823496] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1409.823686] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1409.823891] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-574c0dd9-d3e1-4d00-8367-a8d9d8643b96 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1409.838536] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1409.838536] nova-compute[62208]: warnings.warn( [ 1409.844417] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1409.844417] nova-compute[62208]: value = "task-38594" [ 1409.844417] nova-compute[62208]: _type = "Task" [ 1409.844417] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1409.847863] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1409.847863] nova-compute[62208]: warnings.warn( [ 1409.853528] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38594, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1410.117853] nova-compute[62208]: DEBUG nova.network.neutron [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Updated VIF entry in instance network info cache for port 80131c19-baa0-4896-9681-6b5a4d061684. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1410.118335] nova-compute[62208]: DEBUG nova.network.neutron [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Updating instance_info_cache with network_info: [{"id": "80131c19-baa0-4896-9681-6b5a4d061684", "address": "fa:16:3e:72:94:0f", "network": {"id": "4ff34335-1f93-40d4-a4ee-e581b57a773a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1917389442-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7a607d976ab14539a9d204c9437c3522", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f5fe645c-e088-401e-ab53-4ae2981dea72", "external-id": "nsx-vlan-transportzone-219", "segmentation_id": 219, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80131c19-ba", "ovs_interfaceid": "80131c19-baa0-4896-9681-6b5a4d061684", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1410.131820] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-dbc0ac8d-1f45-414d-9dca-7d606da99d81 req-3cb73b53-897a-4509-b729-62ebd1288fcc service nova] Releasing lock "refresh_cache-ec31fb88-38c6-400d-b1ec-c93af711a1f6" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1410.348736] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1410.348736] nova-compute[62208]: warnings.warn( [ 1410.354793] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38594, 'name': CreateVM_Task, 'duration_secs': 0.303876} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1410.354966] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1410.355556] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1410.355773] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1410.358677] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6539a05c-ff46-4761-85f3-39c39931dd61 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1410.368829] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1410.368829] nova-compute[62208]: warnings.warn( [ 1410.391708] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1410.392427] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6bb55c95-b051-4ab7-9d79-c69d75dcd2e6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1410.403500] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1410.403500] nova-compute[62208]: warnings.warn( [ 1410.409532] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 1410.409532] nova-compute[62208]: value = "task-38595" [ 1410.409532] nova-compute[62208]: _type = "Task" [ 1410.409532] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1410.412548] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1410.412548] nova-compute[62208]: warnings.warn( [ 1410.418200] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38595, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1410.913726] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1410.913726] nova-compute[62208]: warnings.warn( [ 1410.919844] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38595, 'name': ReconfigVM_Task, 'duration_secs': 0.099276} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1410.920135] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1410.920354] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.565s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1410.920600] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1410.920743] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1410.921102] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquired external semaphore "[datastore2] 
devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1410.921383] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-44efff21-adec-4cf4-8773-6c2ce9c215f5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1410.923009] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1410.923009] nova-compute[62208]: warnings.warn( [ 1410.926236] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 1410.926236] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5236d023-d9d8-99d7-8736-9137eec1f1ed" [ 1410.926236] nova-compute[62208]: _type = "Task" [ 1410.926236] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1410.929351] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1410.929351] nova-compute[62208]: warnings.warn( [ 1410.934324] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5236d023-d9d8-99d7-8736-9137eec1f1ed, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1411.430429] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1411.430429] nova-compute[62208]: warnings.warn( [ 1411.437064] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1411.437325] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1411.437531] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1411.545547] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1414.065238] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1419.140919] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1421.141206] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1421.141577] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11219}} [ 1421.151554] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] There are 0 instances to clean {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11228}} [ 1423.151294] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1424.140780] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1424.140966] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1424.141090] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1424.161366] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.161699] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.161699] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.161770] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.161881] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.162003] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.162121] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.162239] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.162355] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.162470] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1424.162587] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1424.163091] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1424.163295] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1424.163426] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances with incomplete migration {{(pid=62208) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11257}} [ 1425.149655] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1425.149924] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1426.141507] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1426.152022] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1426.152300] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1426.152473] 
nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.152630] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1426.153742] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-813c2b48-60f2-4e4d-8918-dabf72fae6fe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.156926] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1426.156926] nova-compute[62208]: warnings.warn( [ 1426.163484] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1892226-ad34-4cfe-96a6-1cccb10fed26 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.166837] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1426.166837] nova-compute[62208]: warnings.warn( [ 1426.177184] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d81fa2ae-4dda-4acb-8456-7d75d7629967 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.179371] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1426.179371] nova-compute[62208]: warnings.warn( [ 1426.183572] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95008129-7d2c-4707-a5b4-29ba32e6b0e4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.186610] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1426.186610] nova-compute[62208]: warnings.warn( [ 1426.214135] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181890MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1426.214315] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1426.214495] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1426.368566] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d1a22d6e-d913-47de-9188-507d2475f745 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.368815] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.368884] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.368979] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.369098] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.369214] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.369329] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.369442] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.369554] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.369665] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1426.381748] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance b7855bc3-7f66-4755-b8bb-82604ae49df5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1426.393062] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1426.402850] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1426.412914] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1426.413181] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1426.413333] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1426.429809] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing inventories for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1426.444216] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating ProviderTree inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1426.444446] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1426.455868] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing aggregate associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, aggregates: None {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1426.474066] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing trait associations for 
resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1426.650583] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f303b57-df19-4d57-8fcc-1f18112799a9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.653086] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1426.653086] nova-compute[62208]: warnings.warn( [ 1426.658676] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e70e03c0-c7c2-448d-a646-87946924685e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.661532] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1426.661532] nova-compute[62208]: warnings.warn( [ 1426.688770] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f4ac478-a6a9-4d32-8f94-3642ecddd950 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.691196] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1426.691196] nova-compute[62208]: warnings.warn( [ 1426.696586] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a172215e-7e64-4912-9565-d46129849518 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.701202] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1426.701202] nova-compute[62208]: warnings.warn( [ 1426.710646] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1426.718860] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1426.734116] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1426.734306] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.520s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1429.734707] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.734954] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.735098] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1436.142439] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1455.502956] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1455.502956] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1455.503854] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1455.505501] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1455.505776] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Copying Virtual Disk [datastore2] vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/fabae672-e0dc-45b1-9ab3-34495e136b0f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1455.506479] 
nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bc87548e-a084-41ea-91c2-7e1bef4fc714 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1455.509581] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1455.509581] nova-compute[62208]: warnings.warn( [ 1455.515374] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Waiting for the task: (returnval){ [ 1455.515374] nova-compute[62208]: value = "task-38596" [ 1455.515374] nova-compute[62208]: _type = "Task" [ 1455.515374] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1455.519146] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1455.519146] nova-compute[62208]: warnings.warn( [ 1455.524573] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Task: {'id': task-38596, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1456.019968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.019968] nova-compute[62208]: warnings.warn( [ 1456.025616] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1456.025943] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1456.026518] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] Traceback (most recent call last): [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] yield resources [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] self.driver.spawn(context, instance, image_meta, [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] self._fetch_image_if_missing(context, vi) [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] image_cache(vi, tmp_image_ds_loc) [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] vm_util.copy_virtual_disk( [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] 
session._wait_for_task(vmdk_copy_task) [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] return self.wait_for_task(task_ref) [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] return evt.wait() [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] result = hub.switch() [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] return self.greenlet.switch() [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] self.f(*self.args, **self.kw) [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] raise exceptions.translate_fault(task_info.error) [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] Faults: ['InvalidArgument'] [ 1456.026518] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] [ 1456.027568] nova-compute[62208]: INFO nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Terminating instance [ 1456.028466] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1456.028673] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 
tempest-AttachVolumeNegativeTest-648215624-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1456.028919] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ac5bd567-a4a4-4e8f-b248-0cc3404acb23 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.031338] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1456.031535] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1456.032331] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9930bba-1360-4d4b-8b92-15cd0144fbe8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.034570] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.034570] nova-compute[62208]: warnings.warn( [ 1456.034939] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.034939] nova-compute[62208]: warnings.warn( [ 1456.039481] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1456.039736] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e7403d4-0e24-40c3-947f-925a43240907 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.042265] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1456.042436] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1456.043014] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.043014] nova-compute[62208]: warnings.warn( [ 1456.043435] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-555e80a2-eb87-4833-8486-275424b8cd96 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.045680] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.045680] nova-compute[62208]: warnings.warn( [ 1456.048697] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1456.048697] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b7aa22-b0da-2a29-23c8-6b965a0d3871" [ 1456.048697] nova-compute[62208]: _type = "Task" [ 1456.048697] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1456.051608] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.051608] nova-compute[62208]: warnings.warn( [ 1456.057082] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b7aa22-b0da-2a29-23c8-6b965a0d3871, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1456.109722] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1456.110008] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1456.110208] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Deleting the datastore file [datastore2] d1a22d6e-d913-47de-9188-507d2475f745 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1456.110500] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ad93ac2d-a88f-417a-b50a-f14196eed4b7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.112725] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.112725] nova-compute[62208]: warnings.warn( [ 1456.118286] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Waiting for the task: (returnval){ [ 1456.118286] nova-compute[62208]: value = "task-38598" [ 1456.118286] nova-compute[62208]: _type = "Task" [ 1456.118286] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1456.122128] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.122128] nova-compute[62208]: warnings.warn( [ 1456.127313] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Task: {'id': task-38598, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1456.553583] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.553583] nova-compute[62208]: warnings.warn( [ 1456.560157] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1456.560431] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating directory with path [datastore2] vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1456.560668] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ef822354-389b-4503-942c-5fa65b0a629a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.562700] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.562700] nova-compute[62208]: warnings.warn( [ 1456.573594] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Created directory with path [datastore2] vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1456.573926] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Fetch image to [datastore2] vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1456.574194] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1456.575048] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79dfc431-4c10-4802-8be5-e19626b26695 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.577914] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.577914] nova-compute[62208]: warnings.warn( [ 1456.584490] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18d50a1d-4487-4b56-a290-a77995d7bdd9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.587146] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.587146] nova-compute[62208]: warnings.warn( [ 1456.596225] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a40039e4-4b98-43bd-b841-bb2b7d12a712 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.600365] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.600365] nova-compute[62208]: warnings.warn( [ 1456.631936] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5869bc3-0d94-484e-975b-d7d0538b02e2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.633796] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.633796] nova-compute[62208]: warnings.warn( [ 1456.634232] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.634232] nova-compute[62208]: warnings.warn( [ 1456.640478] nova-compute[62208]: DEBUG oslo_vmware.api [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Task: {'id': task-38598, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081492} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1456.641979] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1456.642201] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1456.642381] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1456.642556] nova-compute[62208]: INFO nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1456.645369] nova-compute[62208]: DEBUG nova.compute.claims [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936574490> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1456.645538] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1456.645767] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1456.648654] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2babb3a5-8f88-41d4-8b02-4a5bb97ceef2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.650414] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.650414] nova-compute[62208]: warnings.warn( [ 1456.672060] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1456.732899] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1456.800371] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1456.800673] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1456.923741] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-985ef1f5-6e12-4aff-addd-ca1632c1ed06 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.927164] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.927164] nova-compute[62208]: warnings.warn( [ 1456.932192] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a8fa3f4-57a3-45c5-b57c-21185aaa1ae6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.935080] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.935080] nova-compute[62208]: warnings.warn( [ 1456.962238] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-949b5c3f-da79-4c2a-86ea-f148f8383322 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.964658] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.964658] nova-compute[62208]: warnings.warn( [ 1456.969845] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-520f8a2e-864c-4e95-93cd-b98e4b46fd94 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1456.973442] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1456.973442] nova-compute[62208]: warnings.warn( [ 1456.983087] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1456.992301] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1457.010392] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.364s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1457.010955] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] Traceback (most recent call last): [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] self.driver.spawn(context, instance, image_meta, [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] self._fetch_image_if_missing(context, vi) [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
639, in _fetch_image_if_missing [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] image_cache(vi, tmp_image_ds_loc) [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] vm_util.copy_virtual_disk( [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] session._wait_for_task(vmdk_copy_task) [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] return self.wait_for_task(task_ref) [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] return evt.wait() [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] result = hub.switch() [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] return self.greenlet.switch() [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] self.f(*self.args, **self.kw) [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] raise exceptions.translate_fault(task_info.error) [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] Faults: ['InvalidArgument'] [ 1457.010955] nova-compute[62208]: ERROR nova.compute.manager [instance: d1a22d6e-d913-47de-9188-507d2475f745] [ 1457.011886] nova-compute[62208]: DEBUG nova.compute.utils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c 
tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1457.013692] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Build of instance d1a22d6e-d913-47de-9188-507d2475f745 was re-scheduled: A specified parameter was not correct: fileType [ 1457.013692] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1457.014081] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1457.014255] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1457.014425] nova-compute[62208]: DEBUG nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1457.014613] nova-compute[62208]: DEBUG nova.network.neutron [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1457.388949] nova-compute[62208]: DEBUG nova.network.neutron [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1457.403534] nova-compute[62208]: INFO nova.compute.manager [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Took 0.39 seconds to deallocate network for instance. 
[ 1457.511890] nova-compute[62208]: INFO nova.scheduler.client.report [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Deleted allocations for instance d1a22d6e-d913-47de-9188-507d2475f745 [ 1457.532095] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-63eb822a-63ac-4d21-a5f4-d61859f4792c tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "d1a22d6e-d913-47de-9188-507d2475f745" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 538.709s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1457.533268] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "d1a22d6e-d913-47de-9188-507d2475f745" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 342.067s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1457.533485] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Acquiring lock "d1a22d6e-d913-47de-9188-507d2475f745-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1457.533695] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "d1a22d6e-d913-47de-9188-507d2475f745-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1457.533843] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "d1a22d6e-d913-47de-9188-507d2475f745-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1457.535909] nova-compute[62208]: INFO nova.compute.manager [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Terminating instance [ 1457.537992] nova-compute[62208]: DEBUG nova.compute.manager [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1457.538187] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1457.538668] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cc8ae1c6-1b3e-44dc-aaaa-1a6f4bd17ccb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1457.540826] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1457.540826] nova-compute[62208]: warnings.warn( [ 1457.548662] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e0e9587-642c-4848-99c6-64fb441a0ffe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1457.560021] nova-compute[62208]: DEBUG nova.compute.manager [None req-2f1ef51f-3ee9-45d8-9653-3c39d9d37dd5 tempest-ServerPasswordTestJSON-653179953 tempest-ServerPasswordTestJSON-653179953-project-member] [instance: 795665a3-58eb-4d8a-bedc-84e399e11bb7] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1457.562424] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1457.562424] nova-compute[62208]: warnings.warn( [ 1457.586522] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d1a22d6e-d913-47de-9188-507d2475f745 could not be found. [ 1457.586730] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1457.586924] nova-compute[62208]: INFO nova.compute.manager [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1457.587183] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1457.587424] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1457.587526] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d1a22d6e-d913-47de-9188-507d2475f745] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1457.594847] nova-compute[62208]: DEBUG nova.compute.manager [None req-2f1ef51f-3ee9-45d8-9653-3c39d9d37dd5 tempest-ServerPasswordTestJSON-653179953 tempest-ServerPasswordTestJSON-653179953-project-member] [instance: 795665a3-58eb-4d8a-bedc-84e399e11bb7] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1457.626356] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1457.628736] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2f1ef51f-3ee9-45d8-9653-3c39d9d37dd5 tempest-ServerPasswordTestJSON-653179953 tempest-ServerPasswordTestJSON-653179953-project-member] Lock "795665a3-58eb-4d8a-bedc-84e399e11bb7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.596s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1457.638563] nova-compute[62208]: INFO nova.compute.manager [-] [instance: d1a22d6e-d913-47de-9188-507d2475f745] Took 0.05 seconds to deallocate network for instance. [ 1457.647711] nova-compute[62208]: DEBUG nova.compute.manager [None req-8e075d5e-0318-4727-bff8-d9a0320a1e9f tempest-ServerAddressesTestJSON-433221683 tempest-ServerAddressesTestJSON-433221683-project-member] [instance: a72bbc63-b475-4be7-a412-f0f893a094f4] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1457.676458] nova-compute[62208]: DEBUG nova.compute.manager [None req-8e075d5e-0318-4727-bff8-d9a0320a1e9f tempest-ServerAddressesTestJSON-433221683 tempest-ServerAddressesTestJSON-433221683-project-member] [instance: a72bbc63-b475-4be7-a412-f0f893a094f4] Instance disappeared before build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1457.722881] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8e075d5e-0318-4727-bff8-d9a0320a1e9f tempest-ServerAddressesTestJSON-433221683 tempest-ServerAddressesTestJSON-433221683-project-member] Lock "a72bbc63-b475-4be7-a412-f0f893a094f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 235.590s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1457.736185] nova-compute[62208]: DEBUG nova.compute.manager [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1457.763598] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-61f0aa73-4513-4a6d-83d8-cdcbbaf13411 tempest-ImagesOneServerTestJSON-549668524 tempest-ImagesOneServerTestJSON-549668524-project-member] Lock "d1a22d6e-d913-47de-9188-507d2475f745" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.230s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1457.764514] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "d1a22d6e-d913-47de-9188-507d2475f745" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 283.341s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1457.764621] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d1a22d6e-d913-47de-9188-507d2475f745] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1457.764780] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "d1a22d6e-d913-47de-9188-507d2475f745" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1457.797753] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1457.798068] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1457.799537] nova-compute[62208]: INFO nova.compute.claims [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1457.923003] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "b7855bc3-7f66-4755-b8bb-82604ae49df5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1458.034856] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b71338b9-7663-4f90-b96e-ff6a922ecec4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.038382] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.038382] nova-compute[62208]: warnings.warn( [ 1458.044081] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cb34768-f8e1-4176-aee6-7147faf1440e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.047511] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.047511] nova-compute[62208]: warnings.warn( [ 1458.075843] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c3c9a00-1120-446c-8035-991fb1a86366 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.078610] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.078610] nova-compute[62208]: warnings.warn( [ 1458.086181] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-710b03e7-4a38-464f-8c48-e7efac442624 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.091351] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.091351] nova-compute[62208]: warnings.warn( [ 1458.102000] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1458.111143] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1458.130345] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1458.131002] nova-compute[62208]: DEBUG nova.compute.manager [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1458.165643] nova-compute[62208]: DEBUG nova.compute.claims [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9362e9480> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1458.165902] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1458.166219] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1458.368181] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a99522d-d444-4a1d-a157-521254740565 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.370736] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.370736] nova-compute[62208]: warnings.warn( [ 1458.375442] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a686573-8fe0-4bc3-9233-683aefeba8c7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.378316] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.378316] nova-compute[62208]: warnings.warn( [ 1458.404882] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49ebed5f-5be7-4c8d-925b-4b1c9e7da008 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.407432] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.407432] nova-compute[62208]: warnings.warn( [ 1458.412961] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68399289-b1fa-4c56-9ec6-4f4c80462df7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.416665] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.416665] nova-compute[62208]: warnings.warn( [ 1458.426334] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1458.434973] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1458.450641] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.284s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1458.451406] nova-compute[62208]: DEBUG nova.compute.utils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Conflict updating instance b7855bc3-7f66-4755-b8bb-82604ae49df5. Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'} {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1458.452820] nova-compute[62208]: DEBUG nova.compute.manager [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Instance disappeared during build. 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2504}} [ 1458.452993] nova-compute[62208]: DEBUG nova.compute.manager [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1458.453211] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "refresh_cache-b7855bc3-7f66-4755-b8bb-82604ae49df5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1458.453377] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "refresh_cache-b7855bc3-7f66-4755-b8bb-82604ae49df5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1458.453510] nova-compute[62208]: DEBUG nova.network.neutron [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1458.482941] nova-compute[62208]: DEBUG nova.network.neutron [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1458.548569] nova-compute[62208]: DEBUG nova.network.neutron [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1458.558489] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "refresh_cache-b7855bc3-7f66-4755-b8bb-82604ae49df5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1458.558707] nova-compute[62208]: DEBUG nova.compute.manager [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1458.558902] nova-compute[62208]: DEBUG nova.compute.manager [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1458.559077] nova-compute[62208]: DEBUG nova.network.neutron [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1458.577197] nova-compute[62208]: DEBUG nova.network.neutron [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1458.585785] nova-compute[62208]: DEBUG nova.network.neutron [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1458.594484] nova-compute[62208]: INFO nova.compute.manager [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Took 0.04 seconds to deallocate network for instance. 
[ 1458.675676] nova-compute[62208]: INFO nova.scheduler.client.report [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted allocations for instance b7855bc3-7f66-4755-b8bb-82604ae49df5 [ 1458.676117] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-9daa466a-8b4a-4eb8-81f8-b2daf7326011 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "b7855bc3-7f66-4755-b8bb-82604ae49df5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 197.081s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1458.677415] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "b7855bc3-7f66-4755-b8bb-82604ae49df5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.755s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1458.677650] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "b7855bc3-7f66-4755-b8bb-82604ae49df5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1458.678300] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "b7855bc3-7f66-4755-b8bb-82604ae49df5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1458.678495] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "b7855bc3-7f66-4755-b8bb-82604ae49df5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1458.680663] nova-compute[62208]: INFO nova.compute.manager [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Terminating instance [ 1458.682299] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "refresh_cache-b7855bc3-7f66-4755-b8bb-82604ae49df5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1458.682482] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 
tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "refresh_cache-b7855bc3-7f66-4755-b8bb-82604ae49df5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1458.682678] nova-compute[62208]: DEBUG nova.network.neutron [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1458.690410] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1458.718055] nova-compute[62208]: DEBUG nova.network.neutron [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1458.748047] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1458.748317] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1458.749799] nova-compute[62208]: INFO nova.compute.claims [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1458.776628] nova-compute[62208]: DEBUG nova.network.neutron [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1458.788300] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "refresh_cache-b7855bc3-7f66-4755-b8bb-82604ae49df5" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1458.788652] nova-compute[62208]: DEBUG nova.compute.manager [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 
tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1458.788836] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1458.789331] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f1f2a213-0d32-459f-a790-9abba83fc898 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.791882] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.791882] nova-compute[62208]: warnings.warn( [ 1458.802668] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8006b5ed-fe4f-4bf1-95d0-1cf34faaaf75 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.815081] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1458.815081] nova-compute[62208]: warnings.warn( [ 1458.834434] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b7855bc3-7f66-4755-b8bb-82604ae49df5 could not be found. [ 1458.834628] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1458.834939] nova-compute[62208]: INFO nova.compute.manager [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1458.835257] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1458.839015] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1458.839193] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1458.862245] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1458.869599] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1458.880779] nova-compute[62208]: INFO nova.compute.manager [-] [instance: b7855bc3-7f66-4755-b8bb-82604ae49df5] Took 0.04 seconds to deallocate network for instance. [ 1459.006479] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-edb6ee49-7bec-4ca8-98bb-174a40547dc3 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "b7855bc3-7f66-4755-b8bb-82604ae49df5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.329s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.008377] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dbbd8ab-773b-43f8-af1b-c14f7637e166 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.012489] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1459.012489] nova-compute[62208]: warnings.warn( [ 1459.020035] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fecae5b-c687-482a-89fd-c82cd0f0dff4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.024024] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1459.024024] nova-compute[62208]: warnings.warn( [ 1459.053360] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f8963cf-9c04-4449-9e85-c12eabb8f9d5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.056118] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1459.056118] nova-compute[62208]: warnings.warn( [ 1459.061788] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8617d5ca-4059-4fe5-aa0e-aa11117350ac {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.066433] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1459.066433] nova-compute[62208]: warnings.warn( [ 1459.077750] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1459.086691] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1459.106287] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.358s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.106816] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1459.144022] nova-compute[62208]: DEBUG nova.compute.utils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1459.145325] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1459.145507] nova-compute[62208]: DEBUG nova.network.neutron [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1459.157002] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1459.190923] nova-compute[62208]: DEBUG nova.policy [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ebce6a5046ee4205872b7e53550d3af0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05e12525b24d49cf8d295b53a9aeefca', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1459.230427] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1459.253125] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1459.253467] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1459.253616] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1459.253834] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1459.254013] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1459.254258] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1459.254504] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1459.254713] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1459.255052] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1459.255273] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1459.255484] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1459.256428] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e4ec553-c8f0-4532-a115-11cd4c929a91 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.259186] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1459.259186] nova-compute[62208]: warnings.warn( [ 1459.265089] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-203a3177-9784-415b-a1a7-687ff6843667 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.269227] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1459.269227] nova-compute[62208]: warnings.warn( [ 1459.515234] nova-compute[62208]: DEBUG nova.network.neutron [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Successfully created port: 17e02792-762e-43fb-88ac-ed1f53e7aa90 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1460.404613] nova-compute[62208]: DEBUG nova.compute.manager [req-466f53fd-c168-4c9e-8dcc-442a7ff5cd38 req-0798e109-c0de-45df-b263-41fa53e1b3f0 service nova] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Received event network-vif-plugged-17e02792-762e-43fb-88ac-ed1f53e7aa90 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1460.404909] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-466f53fd-c168-4c9e-8dcc-442a7ff5cd38 req-0798e109-c0de-45df-b263-41fa53e1b3f0 service nova] Acquiring lock "47cd2de6-8094-452e-afd7-aa42128a1b0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1460.405044] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-466f53fd-c168-4c9e-8dcc-442a7ff5cd38 req-0798e109-c0de-45df-b263-41fa53e1b3f0 service nova] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1460.405214] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-466f53fd-c168-4c9e-8dcc-442a7ff5cd38 req-0798e109-c0de-45df-b263-41fa53e1b3f0 service nova] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1460.405382] nova-compute[62208]: DEBUG nova.compute.manager [req-466f53fd-c168-4c9e-8dcc-442a7ff5cd38 req-0798e109-c0de-45df-b263-41fa53e1b3f0 service nova] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] No waiting events found dispatching network-vif-plugged-17e02792-762e-43fb-88ac-ed1f53e7aa90 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1460.405583] nova-compute[62208]: WARNING nova.compute.manager [req-466f53fd-c168-4c9e-8dcc-442a7ff5cd38 req-0798e109-c0de-45df-b263-41fa53e1b3f0 service nova] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Received unexpected event network-vif-plugged-17e02792-762e-43fb-88ac-ed1f53e7aa90 for instance with vm_state building and task_state spawning. 
[ 1460.468816] nova-compute[62208]: DEBUG nova.network.neutron [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Successfully updated port: 17e02792-762e-43fb-88ac-ed1f53e7aa90 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1460.481216] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "refresh_cache-47cd2de6-8094-452e-afd7-aa42128a1b0c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1460.481458] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquired lock "refresh_cache-47cd2de6-8094-452e-afd7-aa42128a1b0c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1460.481633] nova-compute[62208]: DEBUG nova.network.neutron [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1460.554525] nova-compute[62208]: DEBUG nova.network.neutron [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1460.810258] nova-compute[62208]: DEBUG nova.network.neutron [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Updating instance_info_cache with network_info: [{"id": "17e02792-762e-43fb-88ac-ed1f53e7aa90", "address": "fa:16:3e:96:7a:17", "network": {"id": "65109d25-d14a-4a19-93e6-98aec089cff0", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-977375263-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "05e12525b24d49cf8d295b53a9aeefca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccf76700-491b-4462-ab19-e6d3a9ff87ac", "external-id": "nsx-vlan-transportzone-956", "segmentation_id": 956, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap17e02792-76", "ovs_interfaceid": "17e02792-762e-43fb-88ac-ed1f53e7aa90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1460.823776] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Releasing lock "refresh_cache-47cd2de6-8094-452e-afd7-aa42128a1b0c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1460.824116] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Instance network_info: |[{"id": "17e02792-762e-43fb-88ac-ed1f53e7aa90", "address": "fa:16:3e:96:7a:17", "network": {"id": "65109d25-d14a-4a19-93e6-98aec089cff0", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-977375263-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "05e12525b24d49cf8d295b53a9aeefca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccf76700-491b-4462-ab19-e6d3a9ff87ac", "external-id": "nsx-vlan-transportzone-956", "segmentation_id": 956, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap17e02792-76", "ovs_interfaceid": "17e02792-762e-43fb-88ac-ed1f53e7aa90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1460.824543] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:96:7a:17', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ccf76700-491b-4462-ab19-e6d3a9ff87ac', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '17e02792-762e-43fb-88ac-ed1f53e7aa90', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1460.831944] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Creating folder: Project (05e12525b24d49cf8d295b53a9aeefca). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1460.832615] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3a52702e-db28-41ff-b3e9-01567edf56f6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.834944] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1460.834944] nova-compute[62208]: warnings.warn( [ 1460.844990] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Created folder: Project (05e12525b24d49cf8d295b53a9aeefca) in parent group-v17427. [ 1460.845406] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Creating folder: Instances. Parent ref: group-v17546. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1460.845692] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a20d410e-b533-4e62-a935-c4ba4b9fa3cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.847298] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1460.847298] nova-compute[62208]: warnings.warn( [ 1460.856439] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Created folder: Instances in parent group-v17546. 
[ 1460.856699] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1460.856898] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1460.857207] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b13aa28a-4139-4e51-a204-ffbacf8f6df8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.873225] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1460.873225] nova-compute[62208]: warnings.warn( [ 1460.879812] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1460.879812] nova-compute[62208]: value = "task-38601" [ 1460.879812] nova-compute[62208]: _type = "Task" [ 1460.879812] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1460.883802] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1460.883802] nova-compute[62208]: warnings.warn( [ 1460.892862] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1460.893145] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1460.893316] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38601, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1461.384193] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1461.384193] nova-compute[62208]: warnings.warn( [ 1461.390898] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38601, 'name': CreateVM_Task, 'duration_secs': 0.343091} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1461.391088] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1461.392475] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1461.392796] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1461.395755] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f8d730b-182a-4964-bc0c-30d6cd938ae2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.407583] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1461.407583] nova-compute[62208]: warnings.warn( [ 1461.433497] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Reconfiguring VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1461.433960] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-f27d8595-95d9-41c7-b908-afa215af2e97 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.444730] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1461.444730] nova-compute[62208]: warnings.warn( [ 1461.452121] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Waiting for the task: (returnval){ [ 1461.452121] nova-compute[62208]: value = "task-38602" [ 1461.452121] nova-compute[62208]: _type = "Task" [ 1461.452121] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1461.455275] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1461.455275] nova-compute[62208]: warnings.warn( [ 1461.462753] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Task: {'id': task-38602, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1461.955900] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1461.955900] nova-compute[62208]: warnings.warn( [ 1461.962010] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Task: {'id': task-38602, 'name': ReconfigVM_Task, 'duration_secs': 0.111787} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1461.962287] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Reconfigured VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1461.962524] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.570s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1461.962772] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1461.962920] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1461.963253] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1461.963505] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6d4bc024-3b97-4ca8-a6a9-c386e7d433ef {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.965045] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1461.965045] nova-compute[62208]: warnings.warn( [ 1461.968362] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Waiting for the task: (returnval){ [ 1461.968362] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f6911a-835e-b448-55e9-51200a6d10a2" [ 1461.968362] nova-compute[62208]: _type = "Task" [ 1461.968362] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1461.971239] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1461.971239] nova-compute[62208]: warnings.warn( [ 1461.976400] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f6911a-835e-b448-55e9-51200a6d10a2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1462.451458] nova-compute[62208]: DEBUG nova.compute.manager [req-51d524cb-ec4e-4ada-8d57-246b5b5b6a45 req-e2c07b3b-1569-4d1c-a171-5accd2915233 service nova] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Received event network-changed-17e02792-762e-43fb-88ac-ed1f53e7aa90 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1462.451710] nova-compute[62208]: DEBUG nova.compute.manager [req-51d524cb-ec4e-4ada-8d57-246b5b5b6a45 req-e2c07b3b-1569-4d1c-a171-5accd2915233 service nova] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Refreshing instance network info cache due to event network-changed-17e02792-762e-43fb-88ac-ed1f53e7aa90. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1462.451860] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-51d524cb-ec4e-4ada-8d57-246b5b5b6a45 req-e2c07b3b-1569-4d1c-a171-5accd2915233 service nova] Acquiring lock "refresh_cache-47cd2de6-8094-452e-afd7-aa42128a1b0c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1462.452016] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-51d524cb-ec4e-4ada-8d57-246b5b5b6a45 req-e2c07b3b-1569-4d1c-a171-5accd2915233 service nova] Acquired lock "refresh_cache-47cd2de6-8094-452e-afd7-aa42128a1b0c" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1462.452490] nova-compute[62208]: DEBUG nova.network.neutron [req-51d524cb-ec4e-4ada-8d57-246b5b5b6a45 req-e2c07b3b-1569-4d1c-a171-5accd2915233 service nova] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Refreshing network info cache for port 17e02792-762e-43fb-88ac-ed1f53e7aa90 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1462.472653] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1462.472653] nova-compute[62208]: warnings.warn( [ 1462.479252] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1462.479523] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1462.479736] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1462.692685] nova-compute[62208]: DEBUG nova.network.neutron [req-51d524cb-ec4e-4ada-8d57-246b5b5b6a45 req-e2c07b3b-1569-4d1c-a171-5accd2915233 service nova] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Updated VIF entry in instance network info cache for port 17e02792-762e-43fb-88ac-ed1f53e7aa90. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1462.693086] nova-compute[62208]: DEBUG nova.network.neutron [req-51d524cb-ec4e-4ada-8d57-246b5b5b6a45 req-e2c07b3b-1569-4d1c-a171-5accd2915233 service nova] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Updating instance_info_cache with network_info: [{"id": "17e02792-762e-43fb-88ac-ed1f53e7aa90", "address": "fa:16:3e:96:7a:17", "network": {"id": "65109d25-d14a-4a19-93e6-98aec089cff0", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-977375263-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "05e12525b24d49cf8d295b53a9aeefca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccf76700-491b-4462-ab19-e6d3a9ff87ac", "external-id": "nsx-vlan-transportzone-956", "segmentation_id": 956, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap17e02792-76", "ovs_interfaceid": "17e02792-762e-43fb-88ac-ed1f53e7aa90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1462.703011] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-51d524cb-ec4e-4ada-8d57-246b5b5b6a45 req-e2c07b3b-1569-4d1c-a171-5accd2915233 service nova] Releasing lock "refresh_cache-47cd2de6-8094-452e-afd7-aa42128a1b0c" {{(pid=62208) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1479.148845] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1484.142309] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1484.142650] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1485.136625] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1485.140310] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1485.140476] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1485.140602] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1485.162085] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.162397] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.162397] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.162504] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.162625] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.162746] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.162865] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.162991] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.163242] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.163242] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1485.163364] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1485.163819] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1486.141324] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1486.151369] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1486.151604] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1486.151772] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1486.151992] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1486.153078] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfe49b2c-40ec-4b11-a88e-e36a4831c090 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.155787] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1486.155787] nova-compute[62208]: warnings.warn( [ 1486.162297] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aac1115-917b-4774-a105-72f5f93a8cce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.165967] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1486.165967] nova-compute[62208]: warnings.warn( [ 1486.177987] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d7cb844-d430-4485-a7b1-969bb270e7ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.180512] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1486.180512] nova-compute[62208]: warnings.warn( [ 1486.185379] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cca84b5c-8d27-46bb-ac85-03f0afe5318b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.188303] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1486.188303] nova-compute[62208]: warnings.warn( [ 1486.214741] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181971MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1486.214892] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1486.215080] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1486.284581] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.284736] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.284864] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.284991] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.285115] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.285235] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.285351] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.285467] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.285582] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.285697] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1486.298705] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1486.310231] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1486.321273] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1486.321542] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1486.321681] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1486.489903] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1925840-1bc0-4fbe-8180-f2f4df281d9d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.492630] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1486.492630] nova-compute[62208]: warnings.warn( [ 1486.497800] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d867ca69-473d-44b6-9980-9755ee71aadf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.500995] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1486.500995] nova-compute[62208]: warnings.warn( [ 1486.529097] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abdf7a14-b1df-447b-a6dc-4f38371d4ec4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.531543] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1486.531543] nova-compute[62208]: warnings.warn( [ 1486.536923] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc1fe2f0-2b30-4406-8731-0c6b2f434751 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.540752] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1486.540752] nova-compute[62208]: warnings.warn( [ 1486.550586] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1486.559900] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1486.577980] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1486.578194] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.363s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1490.573021] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1490.596996] 
nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1491.140910] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1491.141213] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1501.910986] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1501.910986] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1501.911766] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1501.914106] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1501.914376] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 
tempest-AttachVolumeNegativeTest-648215624-project-member] Copying Virtual Disk [datastore2] vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/2dda6f43-59de-4f60-a22f-dc21b92b458e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1501.914750] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-41cd75c0-8318-43b6-8e45-c7a907073c4a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.917387] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1501.917387] nova-compute[62208]: warnings.warn( [ 1501.923876] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1501.923876] nova-compute[62208]: value = "task-38603" [ 1501.923876] nova-compute[62208]: _type = "Task" [ 1501.923876] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1501.927137] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1501.927137] nova-compute[62208]: warnings.warn( [ 1501.932828] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38603, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1502.428953] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.428953] nova-compute[62208]: warnings.warn( [ 1502.435065] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1502.435352] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1502.435968] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Traceback (most recent call last): [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] yield resources [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] self.driver.spawn(context, instance, image_meta, [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] self._fetch_image_if_missing(context, vi) [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] image_cache(vi, tmp_image_ds_loc) [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] vm_util.copy_virtual_disk( [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] 
session._wait_for_task(vmdk_copy_task) [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] return self.wait_for_task(task_ref) [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] return evt.wait() [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] result = hub.switch() [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] return self.greenlet.switch() [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] self.f(*self.args, **self.kw) [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] raise exceptions.translate_fault(task_info.error) [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Faults: ['InvalidArgument'] [ 1502.435968] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] [ 1502.437257] nova-compute[62208]: INFO nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Terminating instance [ 1502.437852] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1502.438095] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] 
Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1502.438349] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6f340b7e-d3cc-4a2b-bbdc-7c20ad1dcadb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.440630] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1502.440807] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1502.441553] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1745db1-33c4-40ae-b94e-6bcf1ce4608e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.444387] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.444387] nova-compute[62208]: warnings.warn( [ 1502.444813] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.444813] nova-compute[62208]: warnings.warn( [ 1502.451033] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1502.452291] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-54b3f526-ba0a-4d11-835d-cd9a0c6bc8fd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.453983] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1502.454172] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1502.454854] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f21f3090-48d9-4e63-99c5-85b7608885cc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.457391] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.457391] nova-compute[62208]: warnings.warn( [ 1502.457747] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.457747] nova-compute[62208]: warnings.warn( [ 1502.460915] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1502.460915] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52abf766-6e24-0f4f-c421-1c113f95c8ac" [ 1502.460915] nova-compute[62208]: _type = "Task" [ 1502.460915] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1502.464090] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.464090] nova-compute[62208]: warnings.warn( [ 1502.469341] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52abf766-6e24-0f4f-c421-1c113f95c8ac, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1502.524361] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1502.524361] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1502.524696] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleting the datastore file [datastore2] 7f79eba6-e15c-4402-b46b-028d552a81d4 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1502.524809] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b17f2a54-b318-4eaa-b34a-3a40783138c8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.526835] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.526835] nova-compute[62208]: warnings.warn( [ 1502.532133] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1502.532133] nova-compute[62208]: value = "task-38605" [ 1502.532133] nova-compute[62208]: _type = "Task" [ 1502.532133] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1502.535128] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.535128] nova-compute[62208]: warnings.warn( [ 1502.540552] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38605, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1502.965894] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.965894] nova-compute[62208]: warnings.warn( [ 1502.971726] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1502.972186] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Creating directory with path [datastore2] vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1502.972560] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-68c0ed51-56bd-4789-9500-8b443ae6754a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.975413] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.975413] nova-compute[62208]: warnings.warn( [ 1502.985282] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Created directory with path [datastore2] vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1502.985702] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Fetch image to [datastore2] vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1502.986049] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1502.986945] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17ac7f3a-dbd9-4592-b010-9772918c26b4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.989439] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.989439] nova-compute[62208]: warnings.warn( [ 1502.994003] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b4295e6-8e61-43ea-81d3-70699b14b413 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.996325] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1502.996325] nova-compute[62208]: warnings.warn( [ 1503.003316] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8861f68-ca96-42b0-8ea9-b59c44cb1202 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.007513] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.007513] nova-compute[62208]: warnings.warn( [ 1503.037197] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cf001f4-3870-44a4-b8ef-9fa7caaac4dd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.039682] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.039682] nova-compute[62208]: warnings.warn( [ 1503.040266] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.040266] nova-compute[62208]: warnings.warn( [ 1503.045467] nova-compute[62208]: DEBUG oslo_vmware.api [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38605, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081771} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1503.047161] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1503.047513] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1503.047824] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1503.048162] nova-compute[62208]: INFO nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1503.050672] nova-compute[62208]: DEBUG nova.compute.claims [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9365ef6d0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1503.050973] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1503.051314] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1503.054511] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ea6ef148-3d34-435d-bede-27c329d2c25f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.056372] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.056372] nova-compute[62208]: warnings.warn( [ 1503.076427] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1503.128865] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1503.185852] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1503.186258] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1503.316080] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c54661c9-e091-41e5-ac89-3e638009f879 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.319211] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.319211] nova-compute[62208]: warnings.warn( [ 1503.324505] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f97028d-e532-4567-913a-48899e663870 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.327361] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.327361] nova-compute[62208]: warnings.warn( [ 1503.356389] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dc8b796-67eb-4dd5-855b-c17e6c719e41 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.358851] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.358851] nova-compute[62208]: warnings.warn( [ 1503.363882] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6293314a-53e6-4f5b-8a03-075e60f348e5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.367409] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.367409] nova-compute[62208]: warnings.warn( [ 1503.377277] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1503.385814] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1503.403990] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.353s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1503.404679] nova-compute[62208]: Faults: 
['InvalidArgument'] [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Traceback (most recent call last): [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] self.driver.spawn(context, instance, image_meta, [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] self._fetch_image_if_missing(context, vi) [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] image_cache(vi, tmp_image_ds_loc) [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] vm_util.copy_virtual_disk( [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] session._wait_for_task(vmdk_copy_task) [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] return self.wait_for_task(task_ref) [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] return evt.wait() [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] result = hub.switch() [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1503.404679] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] return self.greenlet.switch() [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] self.f(*self.args, **self.kw) [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] raise exceptions.translate_fault(task_info.error) [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Faults: ['InvalidArgument'] [ 1503.404679] nova-compute[62208]: ERROR nova.compute.manager [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] [ 1503.405669] nova-compute[62208]: DEBUG nova.compute.utils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1503.406592] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Build of instance 7f79eba6-e15c-4402-b46b-028d552a81d4 was re-scheduled: A specified parameter was not correct: fileType [ 1503.406592] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1503.406968] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1503.407144] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1503.407313] nova-compute[62208]: DEBUG nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1503.407475] nova-compute[62208]: DEBUG nova.network.neutron [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1503.728043] nova-compute[62208]: DEBUG nova.network.neutron [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1503.739865] nova-compute[62208]: INFO nova.compute.manager [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Took 0.33 seconds to deallocate network for instance. [ 1503.874776] nova-compute[62208]: INFO nova.scheduler.client.report [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleted allocations for instance 7f79eba6-e15c-4402-b46b-028d552a81d4 [ 1503.907792] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-aa648d57-d92f-4d9c-8df7-94ae0280e5b8 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 536.695s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1503.909159] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 341.211s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1503.909256] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "7f79eba6-e15c-4402-b46b-028d552a81d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1503.909647] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock 
"7f79eba6-e15c-4402-b46b-028d552a81d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1503.909647] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1503.911602] nova-compute[62208]: INFO nova.compute.manager [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Terminating instance [ 1503.913789] nova-compute[62208]: DEBUG nova.compute.manager [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1503.914001] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1503.914516] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-97351125-36e2-4bff-aa61-c21ee5daab30 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.916469] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.916469] nova-compute[62208]: warnings.warn( [ 1503.923974] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e02e0244-9acf-4366-a41c-c21f630c8a6d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.937300] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1503.940084] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1503.940084] nova-compute[62208]: warnings.warn( [ 1503.963567] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7f79eba6-e15c-4402-b46b-028d552a81d4 could not be found. [ 1503.963789] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1503.963974] nova-compute[62208]: INFO nova.compute.manager [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1503.964257] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1503.964485] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1503.964898] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1504.000906] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1504.004816] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1504.004924] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1504.007379] nova-compute[62208]: INFO nova.compute.claims [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1504.012082] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] Took 0.05 seconds to deallocate network for instance. [ 1504.131133] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-232d2cf2-b0bc-45c1-9503-fbf2043dd983 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.222s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1504.132229] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 329.709s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1504.132229] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7f79eba6-e15c-4402-b46b-028d552a81d4] During sync_power_state the instance has a pending task (deleting). Skip. [ 1504.132393] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "7f79eba6-e15c-4402-b46b-028d552a81d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1504.253674] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7179a91-a75a-46f1-8c7f-802af219910c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1504.256486] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1504.256486] nova-compute[62208]: warnings.warn( [ 1504.262358] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cb73a1a-5af9-4800-ba93-e40455053fd9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1504.266204] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1504.266204] nova-compute[62208]: warnings.warn( [ 1504.296074] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a707e4c9-b6f1-4b63-bc4c-c27ae6cd069c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1504.299243] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1504.299243] nova-compute[62208]: warnings.warn( [ 1504.305566] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb8bebb0-a433-4f0f-9005-b4346d99f2c4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1504.310217] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1504.310217] nova-compute[62208]: warnings.warn( [ 1504.324888] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1504.335640] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1504.356013] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1504.356583] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1504.393755] nova-compute[62208]: DEBUG nova.compute.utils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1504.395900] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1504.396152] nova-compute[62208]: DEBUG nova.network.neutron [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1504.408499] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1504.447818] nova-compute[62208]: DEBUG nova.policy [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bd5497a94524d2d97f74b1fbaedd7f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9b50d4a3e0c43d491d13e85d9a2bb8a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1504.489551] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1504.517856] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1504.518162] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1504.518348] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1504.518506] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1504.518654] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1504.518804] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1504.519043] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1504.519220] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1504.519397] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1504.519563] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1504.519742] nova-compute[62208]: DEBUG nova.virt.hardware [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1504.520723] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ab499cd-eb09-499b-a89b-5dcff13f2ebc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1504.523506] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1504.523506] nova-compute[62208]: warnings.warn( [ 1504.530393] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ed4e5e2-117f-475c-bd0c-f6f9cd4a7448 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1504.534202] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1504.534202] nova-compute[62208]: warnings.warn( [ 1504.837169] nova-compute[62208]: DEBUG nova.network.neutron [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Successfully created port: a931f062-eeeb-4c6e-b429-dba949650ec9 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1505.557005] nova-compute[62208]: DEBUG nova.network.neutron [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Successfully updated port: a931f062-eeeb-4c6e-b429-dba949650ec9 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1505.570736] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "refresh_cache-7311ba0c-9a1b-4482-a4eb-6afe993e6656" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1505.570736] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "refresh_cache-7311ba0c-9a1b-4482-a4eb-6afe993e6656" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1505.570736] nova-compute[62208]: DEBUG nova.network.neutron [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1505.613304] nova-compute[62208]: DEBUG nova.network.neutron [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1505.976786] nova-compute[62208]: DEBUG nova.network.neutron [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Updating instance_info_cache with network_info: [{"id": "a931f062-eeeb-4c6e-b429-dba949650ec9", "address": "fa:16:3e:e9:ed:2c", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa931f062-ee", "ovs_interfaceid": "a931f062-eeeb-4c6e-b429-dba949650ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1505.993166] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "refresh_cache-7311ba0c-9a1b-4482-a4eb-6afe993e6656" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1505.993444] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Instance network_info: |[{"id": "a931f062-eeeb-4c6e-b429-dba949650ec9", "address": "fa:16:3e:e9:ed:2c", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa931f062-ee", "ovs_interfaceid": "a931f062-eeeb-4c6e-b429-dba949650ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 
1505.993854] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e9:ed:2c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '459b8c74-0aa6-42b6-996a-42b1c5d7e5c6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a931f062-eeeb-4c6e-b429-dba949650ec9', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1506.001541] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1506.002082] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1506.002312] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-376360ea-e98f-4335-8891-108070e96683 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1506.018748] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1506.018748] nova-compute[62208]: warnings.warn( [ 1506.025360] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1506.025360] nova-compute[62208]: value = "task-38606" [ 1506.025360] nova-compute[62208]: _type = "Task" [ 1506.025360] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1506.030796] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1506.030796] nova-compute[62208]: warnings.warn( [ 1506.036281] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38606, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1506.039423] nova-compute[62208]: DEBUG nova.compute.manager [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Received event network-vif-plugged-a931f062-eeeb-4c6e-b429-dba949650ec9 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1506.039640] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] Acquiring lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1506.039885] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1506.040076] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1506.040252] nova-compute[62208]: DEBUG nova.compute.manager [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] No waiting events found dispatching network-vif-plugged-a931f062-eeeb-4c6e-b429-dba949650ec9 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1506.040418] nova-compute[62208]: WARNING nova.compute.manager [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Received unexpected event network-vif-plugged-a931f062-eeeb-4c6e-b429-dba949650ec9 for instance with vm_state building and task_state spawning. [ 1506.040576] nova-compute[62208]: DEBUG nova.compute.manager [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Received event network-changed-a931f062-eeeb-4c6e-b429-dba949650ec9 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1506.040995] nova-compute[62208]: DEBUG nova.compute.manager [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Refreshing instance network info cache due to event network-changed-a931f062-eeeb-4c6e-b429-dba949650ec9. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1506.040995] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] Acquiring lock "refresh_cache-7311ba0c-9a1b-4482-a4eb-6afe993e6656" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1506.041102] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] Acquired lock "refresh_cache-7311ba0c-9a1b-4482-a4eb-6afe993e6656" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1506.041283] nova-compute[62208]: DEBUG nova.network.neutron [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Refreshing network info cache for port a931f062-eeeb-4c6e-b429-dba949650ec9 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1506.329632] nova-compute[62208]: DEBUG nova.network.neutron [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Updated VIF entry in instance network info cache for port a931f062-eeeb-4c6e-b429-dba949650ec9. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1506.329977] nova-compute[62208]: DEBUG nova.network.neutron [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Updating instance_info_cache with network_info: [{"id": "a931f062-eeeb-4c6e-b429-dba949650ec9", "address": "fa:16:3e:e9:ed:2c", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa931f062-ee", "ovs_interfaceid": "a931f062-eeeb-4c6e-b429-dba949650ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1506.340226] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-571b1d93-642a-4699-ab32-95d6bf0ac9b6 req-68e5571c-260c-44e8-a06f-08f455dab9b0 service nova] Releasing lock "refresh_cache-7311ba0c-9a1b-4482-a4eb-6afe993e6656" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1506.529895] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1506.529895] nova-compute[62208]: warnings.warn( [ 1506.535580] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38606, 'name': CreateVM_Task, 'duration_secs': 0.315724} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1506.535759] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1506.536364] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1506.536594] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1506.539464] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-288ac837-ecc6-47c6-bfae-951137ec4bf2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1506.549360] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1506.549360] nova-compute[62208]: warnings.warn( [ 1506.574201] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Reconfiguring VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1506.574201] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-eaa6b03c-3ae3-4fb4-a785-130b768d4b92 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1506.584347] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1506.584347] nova-compute[62208]: warnings.warn( [ 1506.592471] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1506.592471] nova-compute[62208]: value = "task-38607" [ 1506.592471] nova-compute[62208]: _type = "Task" [ 1506.592471] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1506.595387] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1506.595387] nova-compute[62208]: warnings.warn( [ 1506.601552] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38607, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1507.096731] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1507.096731] nova-compute[62208]: warnings.warn( [ 1507.103160] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38607, 'name': ReconfigVM_Task, 'duration_secs': 0.115104} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1507.103450] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Reconfigured VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1507.103664] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.567s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1507.103916] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1507.104110] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1507.104394] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1507.104659] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-036eab7b-7c3f-46eb-8480-e8fc1007c5b3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.106311] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1507.106311] nova-compute[62208]: warnings.warn( [ 1507.109869] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1507.109869] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a2c34a-ccee-f71c-6ed6-d73497fd4125" [ 1507.109869] nova-compute[62208]: _type = "Task" [ 1507.109869] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1507.113157] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1507.113157] nova-compute[62208]: warnings.warn( [ 1507.118624] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a2c34a-ccee-f71c-6ed6-d73497fd4125, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1507.615773] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1507.615773] nova-compute[62208]: warnings.warn( [ 1507.624547] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1507.624968] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1507.625233] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1513.553259] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1539.144307] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1544.141960] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] 
Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1545.136529] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1545.140532] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1545.140532] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1545.140532] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1545.161559] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.161852] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.161852] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.161959] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.162077] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.162194] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.162315] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.162430] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.162541] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.162653] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1545.162768] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1545.163269] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1546.141719] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1546.152565] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1546.152807] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1546.152977] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1546.153131] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1546.154233] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fc09cad-162c-4ead-b2c6-104e257053a1 {{(pid=62208) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.157215] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1546.157215] nova-compute[62208]: warnings.warn( [ 1546.163717] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbb79eb4-c682-421e-af47-084d142dcc9f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.166944] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1546.166944] nova-compute[62208]: warnings.warn( [ 1546.179018] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db11ed33-7561-40de-906a-6eca15c69441 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.181302] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1546.181302] nova-compute[62208]: warnings.warn( [ 1546.185938] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a062d2d9-f680-469c-bc10-ce7819dff4e9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.188990] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1546.188990] nova-compute[62208]: warnings.warn( [ 1546.214733] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181974MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1546.214902] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1546.215099] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1546.282438] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.282620] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.282751] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.282872] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.282989] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.283110] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.283225] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.283341] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.283457] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.283570] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1546.294803] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1546.305486] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1546.305716] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1546.305861] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1546.447701] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61d5f140-661a-4b09-9dab-e7b5daa146ec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.450257] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1546.450257] nova-compute[62208]: warnings.warn( [ 1546.456402] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18517bf3-9029-42ec-acec-734ea9d845c4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.459389] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1546.459389] nova-compute[62208]: warnings.warn( [ 1546.486754] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40c8fdbe-17d6-4781-a0d8-6e46752b30a4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.489088] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1546.489088] nova-compute[62208]: warnings.warn( [ 1546.494132] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50db65f-fafd-448e-a31b-c371ce16e60d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.497725] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1546.497725] nova-compute[62208]: warnings.warn( [ 1546.507220] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1546.516935] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1546.534635] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1546.535074] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.320s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1547.535318] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1551.141092] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1551.141092] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1551.141092] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1551.141765] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1551.143920] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1551.144247] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Copying Virtual Disk [datastore2] vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/3b8fcfdb-201d-4250-8025-c74fd0ceb4ab/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1551.144577] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-48d6102b-798a-402e-b3c1-62f0fc7e30c5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.147080] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.147080] nova-compute[62208]: warnings.warn( [ 1551.155036] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1551.155036] nova-compute[62208]: value = "task-38608" [ 1551.155036] nova-compute[62208]: _type = "Task" [ 1551.155036] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1551.158332] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.158332] nova-compute[62208]: warnings.warn( [ 1551.164195] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38608, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1551.660049] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.660049] nova-compute[62208]: warnings.warn( [ 1551.666742] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1551.667334] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1551.668359] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Traceback (most recent call last): [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] yield resources [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] self.driver.spawn(context, instance, image_meta, [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] self._fetch_image_if_missing(context, vi) [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 
8f74dd97-b43c-49ef-8a83-401329ebfbdb] image_cache(vi, tmp_image_ds_loc) [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] vm_util.copy_virtual_disk( [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] session._wait_for_task(vmdk_copy_task) [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] return self.wait_for_task(task_ref) [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] return evt.wait() [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] result = hub.switch() [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] return self.greenlet.switch() [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] self.f(*self.args, **self.kw) [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] raise exceptions.translate_fault(task_info.error) [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Faults: ['InvalidArgument'] [ 1551.668359] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] [ 1551.669971] nova-compute[62208]: INFO nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 
8f74dd97-b43c-49ef-8a83-401329ebfbdb] Terminating instance [ 1551.672948] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1551.672948] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1551.672948] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf38673f-750f-4bd4-b9fa-6a8312980adf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.676657] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1551.676657] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1551.677658] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f348b049-4d2d-448d-8ed7-2ac25b57a1b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.681653] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.681653] nova-compute[62208]: warnings.warn( [ 1551.682206] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.682206] nova-compute[62208]: warnings.warn( [ 1551.688047] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1551.689457] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0aca6eec-84e5-441a-8798-fbbfe05d86bb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.691750] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1551.692012] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1551.693063] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-15724990-71ce-4aa8-b40b-80e8ac9bdb7b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.696055] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.696055] nova-compute[62208]: warnings.warn( [ 1551.696594] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.696594] nova-compute[62208]: warnings.warn( [ 1551.700287] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Waiting for the task: (returnval){ [ 1551.700287] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5277a6ea-0ea9-9401-608d-738aaf7ca3de" [ 1551.700287] nova-compute[62208]: _type = "Task" [ 1551.700287] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1551.704034] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.704034] nova-compute[62208]: warnings.warn( [ 1551.712581] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5277a6ea-0ea9-9401-608d-738aaf7ca3de, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1551.838488] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1551.839009] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1551.839233] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Deleting the datastore file [datastore2] 8f74dd97-b43c-49ef-8a83-401329ebfbdb {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1551.840248] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-50476a0b-6f3c-49d3-a9a6-a7ce72de0b0f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.842541] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.842541] nova-compute[62208]: warnings.warn( [ 1551.848712] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1551.848712] nova-compute[62208]: value = "task-38610" [ 1551.848712] nova-compute[62208]: _type = "Task" [ 1551.848712] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1551.852812] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1551.852812] nova-compute[62208]: warnings.warn( [ 1551.858991] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38610, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1552.141061] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.141272] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.141511] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1552.204057] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.204057] nova-compute[62208]: warnings.warn( [ 1552.210421] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1552.210681] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Creating directory with path [datastore2] vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1552.210925] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-324e06bf-8ef7-46d9-abab-b94ede52c72d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.212881] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.212881] nova-compute[62208]: warnings.warn( [ 1552.223715] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Created directory with path [datastore2] vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1552.223968] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Fetch image to [datastore2] vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1552.224106] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1552.224897] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98a5b415-4336-4a42-9694-2e5004ebbfb2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.227422] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.227422] nova-compute[62208]: warnings.warn( [ 1552.232689] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78932955-6841-4b74-9fdc-50d3f4d5b81e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.235053] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.235053] nova-compute[62208]: warnings.warn( [ 1552.243006] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dddd41e0-0ad4-41cf-9148-92438bd4ba94 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.248019] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.248019] nova-compute[62208]: warnings.warn( [ 1552.275854] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf707aa5-6901-44c5-881c-1eb6ab84d686 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.278436] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.278436] nova-compute[62208]: warnings.warn( [ 1552.282955] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a9b6882f-9ac3-4307-9cd3-fb192b5ab05e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.284730] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.284730] nova-compute[62208]: warnings.warn( [ 1552.306965] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1552.352813] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.352813] nova-compute[62208]: warnings.warn( [ 1552.358695] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38610, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.105388} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1552.359814] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1552.361213] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1552.361410] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1552.361587] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1552.361767] nova-compute[62208]: INFO nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Took 0.69 seconds to destroy the instance on the hypervisor. [ 1552.364444] nova-compute[62208]: DEBUG nova.compute.claims [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Aborting claim: <nova.compute.claims.Claim object at 0x7fb937369c30> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1552.364661] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1552.364890] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1552.423782] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1552.423950] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1552.592351] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d74e89e-89bd-463a-ad4a-3620125bd57a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.595369] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.595369] nova-compute[62208]: warnings.warn( [ 1552.601072] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e85715d5-c5a4-4969-b9c5-e913ec8963b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.604420] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.604420] nova-compute[62208]: warnings.warn( [ 1552.632409] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5b18fc0-5af7-4dfb-8a4b-a3e027a25093 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.634947] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.634947] nova-compute[62208]: warnings.warn( [ 1552.640556] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ada407-6c42-4a65-8607-80919e76dc62 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.645729] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1552.645729] nova-compute[62208]: warnings.warn( [ 1552.655987] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1552.665116] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1552.682517] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.317s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1552.683032] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Traceback (most recent call last): [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] self.driver.spawn(context, instance, image_meta, [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] self._fetch_image_if_missing(context, vi) [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing 
[ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] image_cache(vi, tmp_image_ds_loc) [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] vm_util.copy_virtual_disk( [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] session._wait_for_task(vmdk_copy_task) [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] return self.wait_for_task(task_ref) [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] return evt.wait() [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] result = hub.switch() [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] return self.greenlet.switch() [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] self.f(*self.args, **self.kw) [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] raise exceptions.translate_fault(task_info.error) [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Faults: ['InvalidArgument'] [ 1552.683032] nova-compute[62208]: ERROR nova.compute.manager [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] [ 1552.683960] nova-compute[62208]: DEBUG nova.compute.utils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 
tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1552.685168] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Build of instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb was re-scheduled: A specified parameter was not correct: fileType [ 1552.685168] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1552.685680] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1552.685875] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1552.686053] nova-compute[62208]: DEBUG nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1552.686219] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1553.551310] nova-compute[62208]: DEBUG nova.network.neutron [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1553.563348] nova-compute[62208]: INFO nova.compute.manager [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Took 0.88 seconds to deallocate network for instance. 
[ 1553.680046] nova-compute[62208]: INFO nova.scheduler.client.report [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Deleted allocations for instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb [ 1553.712349] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6ad8a301-eeef-4aba-8863-ceaf18c13212 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 541.613s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1553.713044] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 344.807s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1553.713278] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1553.713487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1553.713655] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1553.720124] nova-compute[62208]: INFO nova.compute.manager [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Terminating instance [ 1553.722758] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1553.722974] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquired lock 
"refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1553.723105] nova-compute[62208]: DEBUG nova.network.neutron [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1553.734609] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1553.791490] nova-compute[62208]: DEBUG nova.network.neutron [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1553.817875] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1553.818267] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1553.819874] nova-compute[62208]: INFO nova.compute.claims [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1553.883016] nova-compute[62208]: DEBUG nova.network.neutron [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1553.897193] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Releasing lock "refresh_cache-8f74dd97-b43c-49ef-8a83-401329ebfbdb" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1553.897631] nova-compute[62208]: DEBUG nova.compute.manager [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1553.897821] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1553.898427] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7312fe13-76bd-4a0e-9d81-72fc6ff2a092 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1553.901033] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1553.901033] nova-compute[62208]: warnings.warn( [ 1553.910188] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78248f83-02a0-4301-b804-1af772731b8d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1553.923420] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1553.923420] nova-compute[62208]: warnings.warn( [ 1553.941813] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8f74dd97-b43c-49ef-8a83-401329ebfbdb could not be found. [ 1553.942020] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1553.942205] nova-compute[62208]: INFO nova.compute.manager [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1553.942461] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1553.945183] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1553.945288] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1553.975801] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1553.985041] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1553.993726] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 8f74dd97-b43c-49ef-8a83-401329ebfbdb] Took 0.05 seconds to deallocate network for instance. [ 1554.067822] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb884cfd-9092-46c0-bb10-cd4bc6d7a2ba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.070900] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1554.070900] nova-compute[62208]: warnings.warn( [ 1554.077250] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ed117b2-50f9-477f-8453-5edc29371512 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.081104] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1554.081104] nova-compute[62208]: warnings.warn( [ 1554.113425] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-999b0073-a421-4750-a8dc-eab580627efa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.116903] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f9bf5843-6762-4ea1-955b-2079b06677c6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "8f74dd97-b43c-49ef-8a83-401329ebfbdb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.403s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1554.117259] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1554.117259] nova-compute[62208]: warnings.warn( [ 1554.123492] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a860b0b-0b62-4901-8e89-734dd72d6d05 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.128870] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1554.128870] nova-compute[62208]: warnings.warn( [ 1554.140059] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1554.150563] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1554.170358] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1554.170862] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1554.209012] nova-compute[62208]: DEBUG nova.compute.utils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1554.210299] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1554.210477] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1554.222225] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1554.262507] nova-compute[62208]: DEBUG nova.policy [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2df841e548dc40349fa4f0d8e5dffd85', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7de608ed8dbd42b29b2a1da85885ed92', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1554.294948] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1554.318452] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1554.318713] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1554.318909] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1554.319102] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1554.319256] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1554.319404] nova-compute[62208]: 
DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1554.319614] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1554.319774] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1554.319946] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1554.320132] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1554.320315] nova-compute[62208]: DEBUG nova.virt.hardware [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1554.321253] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7b5fd3c-0b5b-4a35-8d13-6f0cede56af5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.325877] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1554.325877] nova-compute[62208]: warnings.warn( [ 1554.332262] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c1b88f1-9a9b-4e5b-a3ba-7c9173ce51a4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.336150] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1554.336150] nova-compute[62208]: warnings.warn( [ 1554.841163] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Successfully created port: d2955078-2d2e-4c2d-842d-9033d2bb0ef8 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1555.227419] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Successfully created port: ae5a3ee1-d0c2-4c07-a368-d520fd1182a9 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1555.866510] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Successfully updated port: d2955078-2d2e-4c2d-842d-9033d2bb0ef8 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1556.588693] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Successfully updated port: ae5a3ee1-d0c2-4c07-a368-d520fd1182a9 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1556.601814] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1556.601814] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquired lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1556.601814] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1556.653553] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1557.033136] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Updating instance_info_cache with network_info: [{"id": "d2955078-2d2e-4c2d-842d-9033d2bb0ef8", "address": "fa:16:3e:73:a7:ff", "network": {"id": "b6cb4b9b-769a-4e25-a9e9-f12e12c8fd29", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1519723084", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1cf14cf-4f9c-41af-90d0-62e363eb4fba", "external-id": "nsx-vlan-transportzone-521", "segmentation_id": 521, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2955078-2d", "ovs_interfaceid": "d2955078-2d2e-4c2d-842d-9033d2bb0ef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ae5a3ee1-d0c2-4c07-a368-d520fd1182a9", "address": "fa:16:3e:8c:0f:72", "network": {"id": "a9071794-6fdb-4c7e-b88a-fedf33d8dbb6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1734419472", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "05b1253d-2b87-4158-9ff1-dafcf829f11f", "external-id": "nsx-vlan-transportzone-55", "segmentation_id": 55, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae5a3ee1-d0", "ovs_interfaceid": "ae5a3ee1-d0c2-4c07-a368-d520fd1182a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1557.052332] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Releasing lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1557.052690] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Instance network_info: |[{"id": "d2955078-2d2e-4c2d-842d-9033d2bb0ef8", "address": "fa:16:3e:73:a7:ff", "network": {"id": "b6cb4b9b-769a-4e25-a9e9-f12e12c8fd29", "bridge": 
"br-int", "label": "tempest-ServersTestMultiNic-1519723084", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1cf14cf-4f9c-41af-90d0-62e363eb4fba", "external-id": "nsx-vlan-transportzone-521", "segmentation_id": 521, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2955078-2d", "ovs_interfaceid": "d2955078-2d2e-4c2d-842d-9033d2bb0ef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ae5a3ee1-d0c2-4c07-a368-d520fd1182a9", "address": "fa:16:3e:8c:0f:72", "network": {"id": "a9071794-6fdb-4c7e-b88a-fedf33d8dbb6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1734419472", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "05b1253d-2b87-4158-9ff1-dafcf829f11f", "external-id": "nsx-vlan-transportzone-55", "segmentation_id": 55, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae5a3ee1-d0", "ovs_interfaceid": "ae5a3ee1-d0c2-4c07-a368-d520fd1182a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1557.053180] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:73:a7:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c1cf14cf-4f9c-41af-90d0-62e363eb4fba', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd2955078-2d2e-4c2d-842d-9033d2bb0ef8', 'vif_model': 'e1000'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:8c:0f:72', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '05b1253d-2b87-4158-9ff1-dafcf829f11f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ae5a3ee1-d0c2-4c07-a368-d520fd1182a9', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1557.062555] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1557.063144] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1557.063374] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a12b86fd-a861-40d5-bb38-2990b4bd99b8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.080064] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1557.080064] nova-compute[62208]: warnings.warn( [ 1557.086542] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1557.086542] nova-compute[62208]: value = "task-38611" [ 1557.086542] nova-compute[62208]: _type = "Task" [ 1557.086542] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1557.089752] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1557.089752] nova-compute[62208]: warnings.warn( [ 1557.095378] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38611, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1557.406666] nova-compute[62208]: DEBUG nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Received event network-vif-plugged-d2955078-2d2e-4c2d-842d-9033d2bb0ef8 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1557.407081] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Acquiring lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1557.407339] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1557.407588] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1557.407837] nova-compute[62208]: DEBUG nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] No waiting events found dispatching network-vif-plugged-d2955078-2d2e-4c2d-842d-9033d2bb0ef8 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1557.408193] nova-compute[62208]: WARNING nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Received unexpected event network-vif-plugged-d2955078-2d2e-4c2d-842d-9033d2bb0ef8 for instance with vm_state building and task_state spawning. [ 1557.408438] nova-compute[62208]: DEBUG nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Received event network-changed-d2955078-2d2e-4c2d-842d-9033d2bb0ef8 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1557.408732] nova-compute[62208]: DEBUG nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Refreshing instance network info cache due to event network-changed-d2955078-2d2e-4c2d-842d-9033d2bb0ef8. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1557.409010] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Acquiring lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1557.409210] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Acquired lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1557.409437] nova-compute[62208]: DEBUG nova.network.neutron [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Refreshing network info cache for port d2955078-2d2e-4c2d-842d-9033d2bb0ef8 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1557.590994] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1557.590994] nova-compute[62208]: warnings.warn( [ 1557.597821] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38611, 'name': CreateVM_Task, 'duration_secs': 0.374791} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1557.600495] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1557.601216] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1557.601443] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1557.604578] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e3e59cb-a83a-49b6-9d3c-4a36ba490bd0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.619216] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1557.619216] nova-compute[62208]: warnings.warn( [ 1557.645085] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1557.646021] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-c93cce88-d2dc-48be-8983-cc93a46e6708 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.657051] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1557.657051] nova-compute[62208]: warnings.warn( [ 1557.663717] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1557.663717] nova-compute[62208]: value = "task-38612" [ 1557.663717] nova-compute[62208]: _type = "Task" [ 1557.663717] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1557.667527] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1557.667527] nova-compute[62208]: warnings.warn( [ 1557.674568] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38612, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1557.716518] nova-compute[62208]: DEBUG nova.network.neutron [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Updated VIF entry in instance network info cache for port d2955078-2d2e-4c2d-842d-9033d2bb0ef8. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1557.716928] nova-compute[62208]: DEBUG nova.network.neutron [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Updating instance_info_cache with network_info: [{"id": "d2955078-2d2e-4c2d-842d-9033d2bb0ef8", "address": "fa:16:3e:73:a7:ff", "network": {"id": "b6cb4b9b-769a-4e25-a9e9-f12e12c8fd29", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1519723084", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1cf14cf-4f9c-41af-90d0-62e363eb4fba", "external-id": "nsx-vlan-transportzone-521", "segmentation_id": 521, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2955078-2d", "ovs_interfaceid": "d2955078-2d2e-4c2d-842d-9033d2bb0ef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ae5a3ee1-d0c2-4c07-a368-d520fd1182a9", "address": "fa:16:3e:8c:0f:72", "network": {"id": "a9071794-6fdb-4c7e-b88a-fedf33d8dbb6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1734419472", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "05b1253d-2b87-4158-9ff1-dafcf829f11f", "external-id": "nsx-vlan-transportzone-55", "segmentation_id": 55, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae5a3ee1-d0", "ovs_interfaceid": "ae5a3ee1-d0c2-4c07-a368-d520fd1182a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1557.727863] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Releasing lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1557.728142] nova-compute[62208]: DEBUG nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Received event network-vif-plugged-ae5a3ee1-d0c2-4c07-a368-d520fd1182a9 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1557.728343] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d 
req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Acquiring lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1557.728542] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1557.728706] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1557.728868] nova-compute[62208]: DEBUG nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] No waiting events found dispatching network-vif-plugged-ae5a3ee1-d0c2-4c07-a368-d520fd1182a9 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1557.729034] nova-compute[62208]: WARNING nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Received unexpected event network-vif-plugged-ae5a3ee1-d0c2-4c07-a368-d520fd1182a9 for instance with vm_state building and task_state spawning. [ 1557.729194] nova-compute[62208]: DEBUG nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Received event network-changed-ae5a3ee1-d0c2-4c07-a368-d520fd1182a9 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1557.729346] nova-compute[62208]: DEBUG nova.compute.manager [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Refreshing instance network info cache due to event network-changed-ae5a3ee1-d0c2-4c07-a368-d520fd1182a9. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1557.729536] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Acquiring lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1557.729667] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Acquired lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1557.729831] nova-compute[62208]: DEBUG nova.network.neutron [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Refreshing network info cache for port ae5a3ee1-d0c2-4c07-a368-d520fd1182a9 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1557.987218] nova-compute[62208]: DEBUG nova.network.neutron [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Updated VIF entry in instance network info cache for port ae5a3ee1-d0c2-4c07-a368-d520fd1182a9. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1557.987859] nova-compute[62208]: DEBUG nova.network.neutron [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Updating instance_info_cache with network_info: [{"id": "d2955078-2d2e-4c2d-842d-9033d2bb0ef8", "address": "fa:16:3e:73:a7:ff", "network": {"id": "b6cb4b9b-769a-4e25-a9e9-f12e12c8fd29", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1519723084", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1cf14cf-4f9c-41af-90d0-62e363eb4fba", "external-id": "nsx-vlan-transportzone-521", "segmentation_id": 521, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2955078-2d", "ovs_interfaceid": "d2955078-2d2e-4c2d-842d-9033d2bb0ef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ae5a3ee1-d0c2-4c07-a368-d520fd1182a9", "address": "fa:16:3e:8c:0f:72", "network": {"id": "a9071794-6fdb-4c7e-b88a-fedf33d8dbb6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1734419472", "subnets": [{"cidr": "10.0.0.16/28", "dns": [], "gateway": {"address": "10.0.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.18"}}], "meta": {"injected": false, "tenant_id": "7de608ed8dbd42b29b2a1da85885ed92", "mtu": 8950, "physical_network": "default", "tunneled": 
false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "05b1253d-2b87-4158-9ff1-dafcf829f11f", "external-id": "nsx-vlan-transportzone-55", "segmentation_id": 55, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae5a3ee1-d0", "ovs_interfaceid": "ae5a3ee1-d0c2-4c07-a368-d520fd1182a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1558.000104] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4e597ca3-413c-42b8-8c8f-ac832fb2284d req-f38d67bf-926d-4776-9f23-0b1669b07c13 service nova] Releasing lock "refresh_cache-ad00920b-3783-4c01-bb25-4f923d29dad7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1558.168171] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1558.168171] nova-compute[62208]: warnings.warn( [ 1558.174226] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38612, 'name': ReconfigVM_Task, 'duration_secs': 0.107679} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1558.174500] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1558.174713] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.573s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1558.174973] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1558.175119] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1558.175450] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 
tempest-ServersTestMultiNic-118407991-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1558.175727] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99ec28da-fd2b-485f-a7ab-7f514144e654 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.177613] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1558.177613] nova-compute[62208]: warnings.warn( [ 1558.181240] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1558.181240] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b14117-74cc-bdd1-ea8f-86427fb198e2" [ 1558.181240] nova-compute[62208]: _type = "Task" [ 1558.181240] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1558.184412] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1558.184412] nova-compute[62208]: warnings.warn( [ 1558.190677] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b14117-74cc-bdd1-ea8f-86427fb198e2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1558.569996] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1558.685833] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1558.685833] nova-compute[62208]: warnings.warn( [ 1558.692487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1558.693059] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1558.693462] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1561.216514] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "68b1024d-2bfd-4999-9ba2-f2558c223885" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1561.216866] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1596.533486] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1596.533889] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1600.142306] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1601.941332] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1601.941332] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1601.942066] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1601.944073] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1601.944343] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Copying Virtual Disk [datastore2] vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/d6c5a4f4-3277-4acc-b5dc-4d01ee127055/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1601.944631] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-37c3188f-abaf-47fc-ada5-b05cee85a1f9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1601.947048] 
nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1601.947048] nova-compute[62208]: warnings.warn( [ 1601.953229] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Waiting for the task: (returnval){ [ 1601.953229] nova-compute[62208]: value = "task-38613" [ 1601.953229] nova-compute[62208]: _type = "Task" [ 1601.953229] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1601.956752] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1601.956752] nova-compute[62208]: warnings.warn( [ 1601.961707] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Task: {'id': task-38613, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1602.457452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.457452] nova-compute[62208]: warnings.warn( [ 1602.463970] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1602.466304] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1602.466304] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Traceback (most recent call last): [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] yield resources [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] self.driver.spawn(context, instance, image_meta, [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] self._fetch_image_if_missing(context, vi) [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] image_cache(vi, tmp_image_ds_loc) [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] vm_util.copy_virtual_disk( [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] 
session._wait_for_task(vmdk_copy_task) [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] return self.wait_for_task(task_ref) [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] return evt.wait() [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] result = hub.switch() [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] return self.greenlet.switch() [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] self.f(*self.args, **self.kw) [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] raise exceptions.translate_fault(task_info.error) [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Faults: ['InvalidArgument'] [ 1602.466304] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] [ 1602.466304] nova-compute[62208]: INFO nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Terminating instance [ 1602.467702] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1602.467702] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 
tempest-ServerRescueTestJSON-780895511-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1602.467702] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-678915c7-2eb3-499f-b87d-4a8d75922c64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1602.469719] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1602.470114] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1602.470730] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca43315a-0ff4-43be-be1d-9795b64e941b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1602.473576] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.473576] nova-compute[62208]: warnings.warn( [ 1602.474010] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.474010] nova-compute[62208]: warnings.warn( [ 1602.478784] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1602.478784] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e08279c7-1fe6-4819-b11f-58fbd779c6a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1602.481278] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1602.481392] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1602.481980] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.481980] nova-compute[62208]: warnings.warn( [ 1602.482437] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-87b0653d-91a4-4986-a173-e499e79b8276 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1602.484434] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.484434] nova-compute[62208]: warnings.warn( [ 1602.487685] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Waiting for the task: (returnval){ [ 1602.487685] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]529c996b-7c1c-8082-7389-293fad438cdb" [ 1602.487685] nova-compute[62208]: _type = "Task" [ 1602.487685] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1602.491305] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.491305] nova-compute[62208]: warnings.warn( [ 1602.496924] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]529c996b-7c1c-8082-7389-293fad438cdb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1602.542658] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1602.542894] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1602.543137] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Deleting the datastore file [datastore2] 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1602.543492] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8710dbfc-d45d-4d9c-85d4-8123b29c900e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1602.545368] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.545368] nova-compute[62208]: warnings.warn( [ 1602.550899] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Waiting for the task: (returnval){ [ 1602.550899] nova-compute[62208]: value = "task-38615" [ 1602.550899] nova-compute[62208]: _type = "Task" [ 1602.550899] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1602.554282] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.554282] nova-compute[62208]: warnings.warn( [ 1602.560128] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Task: {'id': task-38615, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1602.582483] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "ad00920b-3783-4c01-bb25-4f923d29dad7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1602.992158] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1602.992158] nova-compute[62208]: warnings.warn( [ 1602.998140] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1602.998411] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Creating directory with path [datastore2] vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1602.998673] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2d514d2b-eb9f-4e98-872b-a6d0df921a80 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.000581] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.000581] nova-compute[62208]: warnings.warn( [ 1603.011304] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Created directory with path [datastore2] vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1603.011503] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Fetch image to [datastore2] vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1603.011673] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1603.012473] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81d4eb62-8a28-45bf-b110-c7a41e85c473 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.014935] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.014935] nova-compute[62208]: warnings.warn( [ 1603.019441] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b910fa5b-731d-4918-a60c-63d4cd25e9e8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.021726] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.021726] nova-compute[62208]: warnings.warn( [ 1603.029036] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-281f2723-fa76-4127-a41c-2f57cd418a78 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.034084] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.034084] nova-compute[62208]: warnings.warn( [ 1603.065671] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e80dd75-4829-45b8-98ae-ad2a7939318c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.067968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.067968] nova-compute[62208]: warnings.warn( [ 1603.068382] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.068382] nova-compute[62208]: warnings.warn( [ 1603.074738] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7198080d-3097-4659-bd25-e5a9e783975c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.076498] nova-compute[62208]: DEBUG oslo_vmware.api [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Task: {'id': task-38615, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078515} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1603.076794] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1603.076982] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1603.077153] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1603.077329] nova-compute[62208]: INFO nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1603.078759] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.078759] nova-compute[62208]: warnings.warn( [ 1603.079595] nova-compute[62208]: DEBUG nova.compute.claims [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9364dca30> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1603.079767] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1603.079997] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.099546] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1603.155703] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1603.213655] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1603.213881] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1603.339022] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-656c4c35-70b3-4c73-b177-ce8ce812ff6c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.341518] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.341518] nova-compute[62208]: warnings.warn( [ 1603.346589] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c37d4735-181d-4307-8dcf-60b558003ac3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.349708] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.349708] nova-compute[62208]: warnings.warn( [ 1603.377060] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c852f236-ef75-49d1-a11c-841e427411f2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.379520] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.379520] nova-compute[62208]: warnings.warn( [ 1603.384791] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd4ea9a5-d157-4383-8f81-b797874cffc7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.388430] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.388430] nova-compute[62208]: warnings.warn( [ 1603.398088] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1603.406389] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1603.422882] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.343s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1603.423413] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Traceback (most recent call last): [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] self.driver.spawn(context, instance, image_meta, [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] self._fetch_image_if_missing(context, vi) [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] image_cache(vi, tmp_image_ds_loc) [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] vm_util.copy_virtual_disk( [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] session._wait_for_task(vmdk_copy_task) [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] return self.wait_for_task(task_ref) [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] return evt.wait() [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] result = hub.switch() [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] return self.greenlet.switch() [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] self.f(*self.args, **self.kw) [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] raise exceptions.translate_fault(task_info.error) [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Faults: ['InvalidArgument'] [ 1603.423413] nova-compute[62208]: ERROR nova.compute.manager [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] [ 1603.424413] nova-compute[62208]: DEBUG nova.compute.utils 
[None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1603.425594] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Build of instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a was re-scheduled: A specified parameter was not correct: fileType [ 1603.425594] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1603.426009] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1603.426185] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1603.426357] nova-compute[62208]: DEBUG nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1603.426522] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1603.734110] nova-compute[62208]: DEBUG nova.network.neutron [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1603.750450] nova-compute[62208]: INFO nova.compute.manager [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Took 0.32 seconds to deallocate network for instance. 
[ 1603.855804] nova-compute[62208]: INFO nova.scheduler.client.report [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Deleted allocations for instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a [ 1603.872610] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-4d0b058a-7f00-47df-adca-619951160aea tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 523.766s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1603.873899] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 328.370s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.874235] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Acquiring lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1603.874883] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.875153] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1603.877232] nova-compute[62208]: INFO nova.compute.manager [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Terminating instance [ 1603.879143] nova-compute[62208]: DEBUG nova.compute.manager [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1603.879472] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1603.879958] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b65db74f-fc91-4bc8-93cd-028aaefc2a57 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.882078] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.882078] nova-compute[62208]: warnings.warn( [ 1603.883323] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1603.890287] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f1364eb-c161-4a81-a14f-5fcf63e85482 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.901769] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1603.901769] nova-compute[62208]: warnings.warn( [ 1603.922109] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a could not be found. [ 1603.922328] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1603.922634] nova-compute[62208]: INFO nova.compute.manager [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1603.923025] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1603.925435] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1603.925527] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1603.939774] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1603.940079] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.941611] nova-compute[62208]: INFO nova.compute.claims [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1603.955593] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1603.964714] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 8b4b316e-5e6b-4455-bd0e-016cb89b9b9a] Took 0.04 seconds to deallocate network for instance. 
[ 1604.072942] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8b471f9d-f735-4a8c-b869-a605092d41f0 tempest-ServerRescueTestJSONUnderV235-997901436 tempest-ServerRescueTestJSONUnderV235-997901436-project-member] Lock "8b4b316e-5e6b-4455-bd0e-016cb89b9b9a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.199s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1604.141062] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1604.168548] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-800e1fb9-b21b-4b4d-a28d-676a0bd454da {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1604.170975] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1604.170975] nova-compute[62208]: warnings.warn( [ 1604.176813] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e82acdb2-2c40-424d-8ab8-02df74d5ae60 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1604.180077] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1604.180077] nova-compute[62208]: warnings.warn( [ 1604.207170] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36d494ac-5f07-4dd8-aac7-1ab4abe55319 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1604.209482] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1604.209482] nova-compute[62208]: warnings.warn( [ 1604.214517] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51bf66b3-a1a5-42fc-b226-babddf80db7a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1604.218357] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1604.218357] nova-compute[62208]: warnings.warn( [ 1604.228445] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1604.238360] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1604.254246] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.314s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1604.254719] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1604.288208] nova-compute[62208]: DEBUG nova.compute.utils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1604.289696] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1604.289862] nova-compute[62208]: DEBUG nova.network.neutron [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1604.302872] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1604.336755] nova-compute[62208]: DEBUG nova.policy [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cb3f0377ac64412bf238ba3e97ecd9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4fb2ff705fe34117b2dfb9354ae8cfc8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1604.377746] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1604.404712] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1604.404952] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1604.405109] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1604.405281] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1604.405421] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1604.405563] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1604.405766] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1604.405939] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1604.406102] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1604.406257] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1604.406419] nova-compute[62208]: DEBUG nova.virt.hardware [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1604.407277] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-278f5777-3860-47e6-bd5c-ba3dfab3d1ab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1604.409828] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1604.409828] nova-compute[62208]: warnings.warn( [ 1604.416088] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16efe42f-55d7-44c7-9d37-e116ef33e783 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1604.419644] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1604.419644] nova-compute[62208]: warnings.warn( [ 1604.724887] nova-compute[62208]: DEBUG nova.network.neutron [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Successfully created port: ff2a0d46-81d9-4147-8be4-b8c1281c3e37 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1605.136380] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1605.465315] nova-compute[62208]: DEBUG nova.network.neutron [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Successfully updated port: ff2a0d46-81d9-4147-8be4-b8c1281c3e37 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1605.475645] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "refresh_cache-ca1b4fca-a4bb-4a37-8e88-45e103a3579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1605.475787] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "refresh_cache-ca1b4fca-a4bb-4a37-8e88-45e103a3579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1605.475933] nova-compute[62208]: DEBUG nova.network.neutron [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1605.541077] nova-compute[62208]: DEBUG nova.network.neutron [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1605.711102] nova-compute[62208]: DEBUG nova.network.neutron [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Updating instance_info_cache with network_info: [{"id": "ff2a0d46-81d9-4147-8be4-b8c1281c3e37", "address": "fa:16:3e:a8:e1:a0", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff2a0d46-81", "ovs_interfaceid": "ff2a0d46-81d9-4147-8be4-b8c1281c3e37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1605.724622] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "refresh_cache-ca1b4fca-a4bb-4a37-8e88-45e103a3579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1605.724925] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Instance network_info: |[{"id": "ff2a0d46-81d9-4147-8be4-b8c1281c3e37", "address": "fa:16:3e:a8:e1:a0", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff2a0d46-81", "ovs_interfaceid": "ff2a0d46-81d9-4147-8be4-b8c1281c3e37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1605.725361] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a8:e1:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '13af9422-d668-4413-b63a-766558d83a3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ff2a0d46-81d9-4147-8be4-b8c1281c3e37', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1605.733044] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1605.733658] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1605.733879] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d2b28f97-ec57-4f6e-bc76-e3e02144b682 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1605.748559] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1605.748559] nova-compute[62208]: warnings.warn( [ 1605.754709] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1605.754709] nova-compute[62208]: value = "task-38616" [ 1605.754709] nova-compute[62208]: _type = "Task" [ 1605.754709] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1605.758086] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1605.758086] nova-compute[62208]: warnings.warn( [ 1605.764239] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38616, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1605.826639] nova-compute[62208]: DEBUG nova.compute.manager [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Received event network-vif-plugged-ff2a0d46-81d9-4147-8be4-b8c1281c3e37 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1605.826884] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] Acquiring lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1605.827155] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1605.827276] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1605.827389] nova-compute[62208]: DEBUG nova.compute.manager [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] No waiting events found dispatching network-vif-plugged-ff2a0d46-81d9-4147-8be4-b8c1281c3e37 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1605.827538] nova-compute[62208]: WARNING nova.compute.manager [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Received unexpected event network-vif-plugged-ff2a0d46-81d9-4147-8be4-b8c1281c3e37 for instance with vm_state building and task_state spawning. [ 1605.827699] nova-compute[62208]: DEBUG nova.compute.manager [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Received event network-changed-ff2a0d46-81d9-4147-8be4-b8c1281c3e37 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1605.827852] nova-compute[62208]: DEBUG nova.compute.manager [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Refreshing instance network info cache due to event network-changed-ff2a0d46-81d9-4147-8be4-b8c1281c3e37. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1605.828160] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] Acquiring lock "refresh_cache-ca1b4fca-a4bb-4a37-8e88-45e103a3579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1605.828248] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] Acquired lock "refresh_cache-ca1b4fca-a4bb-4a37-8e88-45e103a3579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1605.828363] nova-compute[62208]: DEBUG nova.network.neutron [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Refreshing network info cache for port ff2a0d46-81d9-4147-8be4-b8c1281c3e37 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1606.143107] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1606.259704] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1606.259704] nova-compute[62208]: warnings.warn( [ 1606.265704] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38616, 'name': CreateVM_Task, 'duration_secs': 0.313026} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1606.265997] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1606.266685] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1606.267048] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1606.276061] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf205de9-fed7-44a5-8f50-fcad5ff5d698 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.280584] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1606.280584] nova-compute[62208]: warnings.warn( [ 1606.308489] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Reconfiguring VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1606.308989] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-95651f4d-25b7-4c60-9809-76790b3ad163 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.319127] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1606.319127] nova-compute[62208]: warnings.warn( [ 1606.326608] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1606.326608] nova-compute[62208]: value = "task-38617" [ 1606.326608] nova-compute[62208]: _type = "Task" [ 1606.326608] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1606.329885] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1606.329885] nova-compute[62208]: warnings.warn( [ 1606.337072] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38617, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1606.367800] nova-compute[62208]: DEBUG nova.network.neutron [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Updated VIF entry in instance network info cache for port ff2a0d46-81d9-4147-8be4-b8c1281c3e37. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1606.369172] nova-compute[62208]: DEBUG nova.network.neutron [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Updating instance_info_cache with network_info: [{"id": "ff2a0d46-81d9-4147-8be4-b8c1281c3e37", "address": "fa:16:3e:a8:e1:a0", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff2a0d46-81", "ovs_interfaceid": "ff2a0d46-81d9-4147-8be4-b8c1281c3e37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1606.380063] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9745e5ec-7150-4f46-9c93-627242cecfab req-f7971fe0-7951-435e-a68a-cf70a76b94b0 service nova] Releasing lock "refresh_cache-ca1b4fca-a4bb-4a37-8e88-45e103a3579f" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1606.831317] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1606.831317] nova-compute[62208]: warnings.warn( [ 1606.837156] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38617, 'name': ReconfigVM_Task, 'duration_secs': 0.105758} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1606.837443] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Reconfigured VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1606.837654] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.571s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1606.837935] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1606.838107] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1606.838452] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1606.838713] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0937acba-c2e4-4979-844a-9bfaa4449793 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.840323] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1606.840323] nova-compute[62208]: warnings.warn( [ 1606.843519] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1606.843519] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52349186-0029-8bb2-4cbb-e51e7e20b197" [ 1606.843519] nova-compute[62208]: _type = "Task" [ 1606.843519] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1606.846618] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1606.846618] nova-compute[62208]: warnings.warn( [ 1606.851929] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52349186-0029-8bb2-4cbb-e51e7e20b197, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1607.141583] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1607.141644] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1607.141713] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1607.162714] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163081] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163081] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163157] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163256] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163567] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163567] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163697] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163731] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163828] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1607.163950] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1607.164530] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1607.177350] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1607.177627] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.177627] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1607.177772] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1607.178981] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-334e35f0-dfb0-4aa4-9e3c-464e9512f0d7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.182580] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.182580] nova-compute[62208]: warnings.warn( [ 1607.189473] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f30caf7f-966c-41ba-8937-9dce2f12b795 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.194155] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.194155] nova-compute[62208]: warnings.warn( [ 1607.205666] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e545a1bb-a8cc-4aa1-9e36-89e2e598ef50 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.208176] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.208176] nova-compute[62208]: warnings.warn( [ 1607.213364] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20c590b7-ffe9-4abf-88c0-2dbf5da41d49 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.217341] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.217341] nova-compute[62208]: warnings.warn( [ 1607.245974] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181957MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1607.245974] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1607.246133] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.321004] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.321159] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.321318] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.321402] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.321515] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.321629] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.321734] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.321841] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.321958] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.322094] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1607.342009] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1607.349147] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.349147] nova-compute[62208]: warnings.warn( [ 1607.356636] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1607.357859] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1607.358068] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1607.358284] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1607.370705] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1607.370952] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1607.371106] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1607.430408] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1607.430899] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.552295] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e0edeed-ec01-4de6-8c46-dc3df185a21b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.555257] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.555257] nova-compute[62208]: warnings.warn( [ 1607.562089] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26d09fbe-6c8b-450a-8946-a06ce3ff9f15 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.565179] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.565179] nova-compute[62208]: warnings.warn( [ 1607.593618] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f64c7536-3cae-494f-a264-56d30fac028e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.596277] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.596277] nova-compute[62208]: warnings.warn( [ 1607.601494] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56d7b91e-bdf0-4409-a6cb-2ff61ee524e6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.605305] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1607.605305] nova-compute[62208]: warnings.warn( [ 1607.616951] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1607.626187] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1607.644627] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1607.644967] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.399s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1608.621511] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1608.857648] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1608.857947] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1610.098873] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a334d70d-723c-437b-8e0c-12a59c7d6579 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "638c73fd-6c45-4f50-8078-2a61f3339ad2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1610.099235] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a334d70d-723c-437b-8e0c-12a59c7d6579 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "638c73fd-6c45-4f50-8078-2a61f3339ad2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1612.141717] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.141361] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.141737] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1615.136697] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1650.565747] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1650.565747] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1650.566703] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1650.568055] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1650.568326] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Copying Virtual Disk [datastore2] vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/5b1f47d7-edee-4214-8ca8-98cd891f9006/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1650.568621] nova-compute[62208]: DEBUG oslo_vmware.service 
[-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-55201bb8-0843-4f7a-8bf6-aefffef82e2e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1650.571175] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1650.571175] nova-compute[62208]: warnings.warn( [ 1650.577881] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Waiting for the task: (returnval){ [ 1650.577881] nova-compute[62208]: value = "task-38618" [ 1650.577881] nova-compute[62208]: _type = "Task" [ 1650.577881] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1650.581354] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1650.581354] nova-compute[62208]: warnings.warn( [ 1650.586714] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Task: {'id': task-38618, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1651.082386] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.082386] nova-compute[62208]: warnings.warn( [ 1651.088418] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1651.088667] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1651.089431] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Traceback (most recent call last): [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] yield resources [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] self.driver.spawn(context, instance, image_meta, [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] self._fetch_image_if_missing(context, vi) [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] image_cache(vi, tmp_image_ds_loc) [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] vm_util.copy_virtual_disk( [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] session._wait_for_task(vmdk_copy_task) 
[ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] return self.wait_for_task(task_ref) [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] return evt.wait() [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] result = hub.switch() [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] return self.greenlet.switch() [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] self.f(*self.args, **self.kw) [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] raise exceptions.translate_fault(task_info.error) [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Faults: ['InvalidArgument'] [ 1651.089431] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] [ 1651.090577] nova-compute[62208]: INFO nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Terminating instance [ 1651.091452] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1651.091656] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating directory with path [datastore2] 
devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1651.091891] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7c2e921e-2a01-40b3-8949-3ab691241cfc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.095669] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1651.095867] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1651.096619] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af2545f8-13e2-45c1-8a9b-d276ebf9b96e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.099030] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.099030] nova-compute[62208]: warnings.warn( [ 1651.099393] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.099393] nova-compute[62208]: warnings.warn( [ 1651.103764] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1651.104112] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-195c7d1f-5df4-4064-b774-bd56da99d93a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.106383] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1651.106616] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1651.107188] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.107188] nova-compute[62208]: warnings.warn( [ 1651.107612] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9e75ccdb-e8f0-4b15-8aac-6eb929bfeb04 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.109757] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.109757] nova-compute[62208]: warnings.warn( [ 1651.112835] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 1651.112835] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d11251-f0e7-f535-8196-f2d177b3b6a3" [ 1651.112835] nova-compute[62208]: _type = "Task" [ 1651.112835] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1651.115691] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.115691] nova-compute[62208]: warnings.warn( [ 1651.120393] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d11251-f0e7-f535-8196-f2d177b3b6a3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1651.176629] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1651.176864] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1651.177047] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Deleting the datastore file [datastore2] 4d29dc3e-1090-49fd-83b7-96b8e6855ede {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1651.177328] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8efb9740-9860-445d-b605-2c4850ac3322 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.179185] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.179185] nova-compute[62208]: warnings.warn( [ 1651.183678] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Waiting for the task: (returnval){ [ 1651.183678] nova-compute[62208]: value = "task-38620" [ 1651.183678] nova-compute[62208]: _type = "Task" [ 1651.183678] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1651.186952] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.186952] nova-compute[62208]: warnings.warn( [ 1651.191861] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Task: {'id': task-38620, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1651.617986] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.617986] nova-compute[62208]: warnings.warn( [ 1651.625392] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1651.625761] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating directory with path [datastore2] vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1651.626057] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0ec5920b-d30a-4543-b69a-d957d11afaa7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.627763] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.627763] nova-compute[62208]: warnings.warn( [ 1651.637320] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1651.637606] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Fetch image to [datastore2] vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1651.637842] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1651.638686] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e4e71b5-6806-4ab1-a81e-fd597d685e17 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.641044] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.641044] nova-compute[62208]: warnings.warn( [ 1651.645946] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9be885f3-7759-43d1-be69-4f8df1e0d085 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.648199] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.648199] nova-compute[62208]: warnings.warn( [ 1651.655326] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ab99ca3-77d7-478a-bf11-63f870d3d768 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.658831] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.658831] nova-compute[62208]: warnings.warn( [ 1651.688184] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c77dff2-afaa-4ef1-bc36-51daac15ec71 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.690390] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.690390] nova-compute[62208]: warnings.warn( [ 1651.690803] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.690803] nova-compute[62208]: warnings.warn( [ 1651.697268] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-df5d4fe2-1993-49e1-afe1-1aaf5d9d6f3e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.699076] nova-compute[62208]: DEBUG oslo_vmware.api [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Task: {'id': task-38620, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077309} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1651.699391] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1651.699635] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1651.699866] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1651.700109] nova-compute[62208]: INFO nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1651.701545] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.701545] nova-compute[62208]: warnings.warn( [ 1651.702263] nova-compute[62208]: DEBUG nova.compute.claims [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936b72cb0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1651.702497] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1651.702797] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1651.724810] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 
1651.779016] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1651.840674] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1651.840866] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1651.993087] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87ebcec1-58bc-4c21-a99a-95e9d68e7431 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.995884] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1651.995884] nova-compute[62208]: warnings.warn( [ 1652.001304] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a13bfa2-f1fe-498b-99a7-11e3ccb1e2d8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.004155] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.004155] nova-compute[62208]: warnings.warn( [ 1652.032154] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bc3b9bc-607a-4187-99f5-1b10dbe9c6ba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.034529] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.034529] nova-compute[62208]: warnings.warn( [ 1652.040513] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3b759c5-36da-4f71-be7e-b162c00e50df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.044282] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.044282] nova-compute[62208]: warnings.warn( [ 1652.054055] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1652.064099] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1652.081708] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.379s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1652.082249] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Traceback (most recent call last): [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] self.driver.spawn(context, instance, image_meta, [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1652.082249] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] self._fetch_image_if_missing(context, vi) [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] image_cache(vi, tmp_image_ds_loc) [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] vm_util.copy_virtual_disk( [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] session._wait_for_task(vmdk_copy_task) [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] return self.wait_for_task(task_ref) [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] return evt.wait() [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] result = hub.switch() [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] return self.greenlet.switch() [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] self.f(*self.args, **self.kw) [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1652.082249] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] raise exceptions.translate_fault(task_info.error) [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Faults: ['InvalidArgument'] [ 1652.082249] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] [ 1652.083282] nova-compute[62208]: DEBUG nova.compute.utils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1652.084641] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Build of instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede was re-scheduled: A specified parameter was not correct: fileType [ 1652.084641] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1652.085007] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1652.085178] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1652.085346] nova-compute[62208]: DEBUG nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1652.085509] nova-compute[62208]: DEBUG nova.network.neutron [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1652.410977] nova-compute[62208]: DEBUG nova.network.neutron [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1652.422686] nova-compute[62208]: INFO nova.compute.manager [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Took 0.34 seconds to deallocate network for instance. [ 1652.524047] nova-compute[62208]: INFO nova.scheduler.client.report [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Deleted allocations for instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede [ 1652.542574] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-05c4e0ec-ca24-40be-8a87-5d3d082b1b07 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 481.231s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1652.543671] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 285.101s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1652.543899] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Acquiring lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1652.544122] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1652.544308] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1652.546263] nova-compute[62208]: INFO nova.compute.manager [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Terminating instance [ 1652.548397] nova-compute[62208]: DEBUG nova.compute.manager [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1652.548592] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1652.549080] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ca4ad2ce-dfb4-4c94-8e4e-4aa78a85a954 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.551104] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.551104] nova-compute[62208]: warnings.warn( [ 1652.558257] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd8e9439-f2ec-4196-a80d-787721cb1b50 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.569372] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1652.571860] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.571860] nova-compute[62208]: warnings.warn( [ 1652.594061] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4d29dc3e-1090-49fd-83b7-96b8e6855ede could not be found. [ 1652.594416] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1652.594665] nova-compute[62208]: INFO nova.compute.manager [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1652.595012] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1652.595314] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1652.595462] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1652.640637] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1652.640917] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1652.642559] nova-compute[62208]: INFO nova.compute.claims [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1652.751006] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1652.759961] nova-compute[62208]: INFO 
nova.compute.manager [-] [instance: 4d29dc3e-1090-49fd-83b7-96b8e6855ede] Took 0.16 seconds to deallocate network for instance. [ 1652.853211] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f6fbcafb-d8c0-4a39-8816-7fd780fd6571 tempest-ServerRescueTestJSON-780895511 tempest-ServerRescueTestJSON-780895511-project-member] Lock "4d29dc3e-1090-49fd-83b7-96b8e6855ede" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.309s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1652.867123] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adb36fd4-ee11-46e6-b6b9-c28839a560ca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.870352] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.870352] nova-compute[62208]: warnings.warn( [ 1652.875958] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a04cab2-7d79-49b0-8355-1b415e05a498 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.879453] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.879453] nova-compute[62208]: warnings.warn( [ 1652.907481] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b752bd4-2541-467b-80de-15dca03ab48a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.909827] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.909827] nova-compute[62208]: warnings.warn( [ 1652.915327] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8650f02a-c5b8-4321-a53f-7d100cce658a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.919553] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1652.919553] nova-compute[62208]: warnings.warn( [ 1652.930848] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1652.940247] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1652.956320] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.315s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1652.956938] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1652.996028] nova-compute[62208]: DEBUG nova.compute.utils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1652.997280] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1652.997547] nova-compute[62208]: DEBUG nova.network.neutron [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1653.012620] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1653.047715] nova-compute[62208]: DEBUG nova.policy [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7534a5a8a37e4451918e35c8b93d4ad5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8eef1e68dea42cf98f03dc8db29498a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1653.087732] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1653.110617] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1653.110934] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1653.111142] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1653.111382] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1653.111585] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1653.111786] nova-compute[62208]: DEBUG nova.virt.hardware [None 
req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1653.112111] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1653.112342] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1653.112671] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1653.112900] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1653.113147] nova-compute[62208]: DEBUG nova.virt.hardware [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1653.114058] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4024292-e45f-4fd7-b907-5ec1c92c5b0b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.116670] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1653.116670] nova-compute[62208]: warnings.warn( [ 1653.123264] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e21391b0-d25b-420e-a641-9af7596775be {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.126979] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1653.126979] nova-compute[62208]: warnings.warn( [ 1653.336702] nova-compute[62208]: DEBUG nova.network.neutron [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Successfully created port: a6959b48-2398-4336-998b-f0bdcc79d00b {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1654.000910] nova-compute[62208]: DEBUG nova.network.neutron [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Successfully updated port: a6959b48-2398-4336-998b-f0bdcc79d00b {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1654.016882] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "refresh_cache-68b1024d-2bfd-4999-9ba2-f2558c223885" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1654.017035] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "refresh_cache-68b1024d-2bfd-4999-9ba2-f2558c223885" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1654.017253] nova-compute[62208]: DEBUG nova.network.neutron [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1654.070016] nova-compute[62208]: DEBUG nova.network.neutron [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1654.226868] nova-compute[62208]: DEBUG nova.network.neutron [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Updating instance_info_cache with network_info: [{"id": "a6959b48-2398-4336-998b-f0bdcc79d00b", "address": "fa:16:3e:be:db:a5", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa6959b48-23", "ovs_interfaceid": "a6959b48-2398-4336-998b-f0bdcc79d00b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1654.240910] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "refresh_cache-68b1024d-2bfd-4999-9ba2-f2558c223885" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1654.241224] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Instance network_info: |[{"id": "a6959b48-2398-4336-998b-f0bdcc79d00b", "address": "fa:16:3e:be:db:a5", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa6959b48-23", "ovs_interfaceid": "a6959b48-2398-4336-998b-f0bdcc79d00b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1654.241658] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:be:db:a5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'da623279-b6f6-4570-8b15-a332120b8b60', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a6959b48-2398-4336-998b-f0bdcc79d00b', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1654.249258] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1654.249826] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1654.250063] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1107f27d-9650-4bff-86f7-6367b5e1378a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.264443] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1654.264443] nova-compute[62208]: warnings.warn( [ 1654.270642] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1654.270642] nova-compute[62208]: value = "task-38621" [ 1654.270642] nova-compute[62208]: _type = "Task" [ 1654.270642] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1654.273738] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1654.273738] nova-compute[62208]: warnings.warn( [ 1654.279104] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38621, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1654.455339] nova-compute[62208]: DEBUG nova.compute.manager [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Received event network-vif-plugged-a6959b48-2398-4336-998b-f0bdcc79d00b {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1654.455570] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] Acquiring lock "68b1024d-2bfd-4999-9ba2-f2558c223885-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1654.455781] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1654.456233] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.456449] nova-compute[62208]: DEBUG nova.compute.manager [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] No waiting events found dispatching network-vif-plugged-a6959b48-2398-4336-998b-f0bdcc79d00b {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1654.456625] nova-compute[62208]: WARNING nova.compute.manager [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Received unexpected event network-vif-plugged-a6959b48-2398-4336-998b-f0bdcc79d00b for instance with vm_state building and task_state spawning. [ 1654.456793] nova-compute[62208]: DEBUG nova.compute.manager [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Received event network-changed-a6959b48-2398-4336-998b-f0bdcc79d00b {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1654.456954] nova-compute[62208]: DEBUG nova.compute.manager [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Refreshing instance network info cache due to event network-changed-a6959b48-2398-4336-998b-f0bdcc79d00b. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1654.457148] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] Acquiring lock "refresh_cache-68b1024d-2bfd-4999-9ba2-f2558c223885" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1654.457287] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] Acquired lock "refresh_cache-68b1024d-2bfd-4999-9ba2-f2558c223885" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1654.457441] nova-compute[62208]: DEBUG nova.network.neutron [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Refreshing network info cache for port a6959b48-2398-4336-998b-f0bdcc79d00b {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1654.742076] nova-compute[62208]: DEBUG nova.network.neutron [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Updated VIF entry in instance network info cache for port a6959b48-2398-4336-998b-f0bdcc79d00b. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1654.742190] nova-compute[62208]: DEBUG nova.network.neutron [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Updating instance_info_cache with network_info: [{"id": "a6959b48-2398-4336-998b-f0bdcc79d00b", "address": "fa:16:3e:be:db:a5", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa6959b48-23", "ovs_interfaceid": "a6959b48-2398-4336-998b-f0bdcc79d00b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1654.753343] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-1027a191-334d-4d76-923a-49d94b1f873d req-063c2cec-dcbf-4c50-beda-2147d5f3f643 service nova] Releasing lock "refresh_cache-68b1024d-2bfd-4999-9ba2-f2558c223885" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1654.776027] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1654.776027] nova-compute[62208]: warnings.warn( [ 1654.782876] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38621, 'name': CreateVM_Task, 'duration_secs': 0.313069} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1654.783055] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1654.783625] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1654.783839] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1654.787490] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc8e7483-6446-4d7b-a468-d2f6c7fe40c7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.798934] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1654.798934] nova-compute[62208]: warnings.warn( [ 1654.822196] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Reconfiguring VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1654.822695] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b75c598d-67e6-49f2-9cee-7e4c7fcb54d9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.832872] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1654.832872] nova-compute[62208]: warnings.warn( [ 1654.838612] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 1654.838612] nova-compute[62208]: value = "task-38622" [ 1654.838612] nova-compute[62208]: _type = "Task" [ 1654.838612] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1654.841425] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1654.841425] nova-compute[62208]: warnings.warn( [ 1654.846624] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38622, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1655.344077] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1655.344077] nova-compute[62208]: warnings.warn( [ 1655.350202] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38622, 'name': ReconfigVM_Task, 'duration_secs': 0.107214} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1655.350581] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Reconfigured VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1655.350805] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.567s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1655.351064] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1655.351209] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1655.351551] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1655.351820] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d2e5dd63-9760-4dc5-a1c5-c0059691c35d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1655.353567] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1655.353567] nova-compute[62208]: warnings.warn( [ 1655.357213] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 1655.357213] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d715dc-b61f-205c-5dd0-f4055aabd6cf" [ 1655.357213] nova-compute[62208]: _type = "Task" [ 1655.357213] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1655.361215] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1655.361215] nova-compute[62208]: warnings.warn( [ 1655.366695] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d715dc-b61f-205c-5dd0-f4055aabd6cf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1655.861553] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1655.861553] nova-compute[62208]: warnings.warn( [ 1655.868448] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1655.868705] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1655.868917] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1656.985517] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1662.141823] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1664.140526] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1667.136622] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1668.140868] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1669.140753] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1669.141098] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1669.141340] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1669.161582] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.161745] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.161880] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.162008] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.162132] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.162251] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.162368] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.162483] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.162598] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.162711] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1669.162828] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1669.163332] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1669.173808] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1669.173808] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1669.173808] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1669.173808] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1669.174785] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3ec2992-adfb-4772-9343-e5b2d2eebc11 {{(pid=62208) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1669.178058] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1669.178058] nova-compute[62208]: warnings.warn( [ 1669.183993] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-519e77af-96a5-467a-8c59-cec65ad878af {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1669.189568] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1669.189568] nova-compute[62208]: warnings.warn( [ 1669.200347] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7daffcd0-42f5-4e3d-ac79-224e11b0a228 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1669.202633] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1669.202633] nova-compute[62208]: warnings.warn( [ 1669.207218] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b5f99e9-75be-41b5-91dc-31a5fd62a374 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1669.218046] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1669.218046] nova-compute[62208]: warnings.warn( [ 1669.248130] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181952MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1669.248311] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1669.248524] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1669.326067] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.326288] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.326458] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.327651] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.327651] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.327651] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.327651] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.327651] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.327836] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.328036] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1669.339524] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1669.349990] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1669.360423] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1669.372440] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 638c73fd-6c45-4f50-8078-2a61f3339ad2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1669.372739] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1669.373031] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1669.567548] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93baaf53-f0c3-4f37-945c-ce85bc8e020b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1669.570295] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1669.570295] nova-compute[62208]: warnings.warn( [ 1669.575550] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09190b6b-f587-44ac-bb9e-15ebe7c036bf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1669.579537] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1669.579537] nova-compute[62208]: warnings.warn( [ 1669.607999] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16bccce5-847b-4e6b-be3d-f86f66f9f816 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1669.610595] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1669.610595] nova-compute[62208]: warnings.warn( [ 1669.616723] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1120c132-b953-4d41-95e0-7a2f403fbeb9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1669.620880] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1669.620880] nova-compute[62208]: warnings.warn( [ 1669.631014] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1669.643959] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1669.661544] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1669.661769] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.413s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1670.639251] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1673.142339] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.141006] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.141401] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, 
skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1701.194654] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1701.194654] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1701.195961] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1701.197229] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1701.197825] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Copying Virtual Disk [datastore2] vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/ce6178ab-8207-460f-98c2-12587f0e56d3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1701.197825] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-46f755c0-7d24-4e51-8afc-6bac545bb433 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.200527] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.200527] nova-compute[62208]: warnings.warn( [ 1701.207783] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 1701.207783] nova-compute[62208]: value = "task-38623" [ 1701.207783] nova-compute[62208]: _type = "Task" [ 1701.207783] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1701.211385] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.211385] nova-compute[62208]: warnings.warn( [ 1701.217376] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38623, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1701.711969] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.711969] nova-compute[62208]: warnings.warn( [ 1701.718230] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1701.718613] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1701.719309] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Traceback (most recent call last): [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] yield resources [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] self.driver.spawn(context, instance, image_meta, [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] self._fetch_image_if_missing(context, vi) [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] image_cache(vi, tmp_image_ds_loc) [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] vm_util.copy_virtual_disk( [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] session._wait_for_task(vmdk_copy_task) [ 1701.719309] 
nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] return self.wait_for_task(task_ref) [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] return evt.wait() [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] result = hub.switch() [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] return self.greenlet.switch() [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] self.f(*self.args, **self.kw) [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] raise exceptions.translate_fault(task_info.error) [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Faults: ['InvalidArgument'] [ 1701.719309] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] [ 1701.720601] nova-compute[62208]: INFO nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Terminating instance [ 1701.721391] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1701.721618] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating directory with path [datastore2] 
devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1701.721877] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ba86eaf6-0982-430c-9124-d29682f408fe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.724377] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1701.724652] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1701.725402] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b82c5ea2-d206-4ec1-8a0e-57c24a9f0286 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.727766] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.727766] nova-compute[62208]: warnings.warn( [ 1701.728211] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.728211] nova-compute[62208]: warnings.warn( [ 1701.732595] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1701.732845] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6ac0ab53-bb22-4a15-9205-c36bc5718b84 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.735110] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1701.735277] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1701.735855] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.735855] nova-compute[62208]: warnings.warn( [ 1701.736298] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-39a99466-3987-4a92-895b-d666f39bc471 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.739209] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.739209] nova-compute[62208]: warnings.warn( [ 1701.742232] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 1701.742232] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52540f6c-6140-fc88-39cc-cab354e3459a" [ 1701.742232] nova-compute[62208]: _type = "Task" [ 1701.742232] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1701.744840] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.744840] nova-compute[62208]: warnings.warn( [ 1701.750095] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52540f6c-6140-fc88-39cc-cab354e3459a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1701.811772] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1701.811994] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1701.812203] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleting the datastore file [datastore2] c0d7e5a6-e905-47ee-87d7-cda8543be1f2 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1701.812469] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7bc3605c-f36c-41e8-bc16-0cd6cc20b405 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.814224] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.814224] nova-compute[62208]: warnings.warn( [ 1701.819468] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 1701.819468] nova-compute[62208]: value = "task-38625" [ 1701.819468] nova-compute[62208]: _type = "Task" [ 1701.819468] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1701.822738] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1701.822738] nova-compute[62208]: warnings.warn( [ 1701.827250] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38625, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1702.246312] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.246312] nova-compute[62208]: warnings.warn( [ 1702.253857] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1702.254119] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating directory with path [datastore2] vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1702.254353] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b604e98a-3d44-4846-a415-c0f71b8a0331 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.256962] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.256962] nova-compute[62208]: warnings.warn( [ 1702.267073] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Created directory with path [datastore2] vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1702.267298] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Fetch image to [datastore2] vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1702.267470] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1702.268325] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9129673a-050a-463f-b541-1bd111c6f9f3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.272040] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.272040] nova-compute[62208]: warnings.warn( [ 1702.276113] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11108d64-e7f2-4cf9-938a-ae41afac6cb3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.278899] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.278899] nova-compute[62208]: warnings.warn( [ 1702.287894] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c975b2df-391b-47a2-8032-b20b4b098e2a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.291797] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.291797] nova-compute[62208]: warnings.warn( [ 1702.328798] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2766e0dd-e31f-4222-aaa5-e437b113797f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.332356] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.332356] nova-compute[62208]: warnings.warn( [ 1702.333102] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.333102] nova-compute[62208]: warnings.warn( [ 1702.338585] nova-compute[62208]: DEBUG oslo_vmware.api [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38625, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066118} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1702.340421] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1702.340869] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1702.341235] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1702.341543] nova-compute[62208]: INFO nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1702.344333] nova-compute[62208]: DEBUG nova.compute.claims [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9376aea70> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1702.344644] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1702.344987] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1702.347948] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-32a5acfe-8145-47b3-8ad8-b1dad3055c30 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.350043] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.350043] nova-compute[62208]: warnings.warn( [ 1702.375902] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1702.538946] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1702.595888] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1702.596327] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1702.619313] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56abd5db-3792-4b0f-8053-74ea0748e487 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.621949] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.621949] nova-compute[62208]: warnings.warn( [ 1702.627541] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ca39038-3cac-480b-9817-53e3492c0af7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.630493] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.630493] nova-compute[62208]: warnings.warn( [ 1702.658520] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fee3f3e-4265-4692-b146-bdda20d03ae4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.660836] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.660836] nova-compute[62208]: warnings.warn( [ 1702.665609] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-067a0c67-19d3-463e-879d-79e75c5b7c00 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.669238] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1702.669238] nova-compute[62208]: warnings.warn( [ 1702.678542] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1702.688302] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1702.704341] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.359s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1702.704864] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1702.704864] nova-compute[62208]: ERROR 
nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Traceback (most recent call last): [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] self.driver.spawn(context, instance, image_meta, [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] self._fetch_image_if_missing(context, vi) [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] image_cache(vi, tmp_image_ds_loc) [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] vm_util.copy_virtual_disk( [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] session._wait_for_task(vmdk_copy_task) [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] return self.wait_for_task(task_ref) [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] return evt.wait() [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] result = hub.switch() [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] 
return self.greenlet.switch() [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] self.f(*self.args, **self.kw) [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] raise exceptions.translate_fault(task_info.error) [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Faults: ['InvalidArgument'] [ 1702.704864] nova-compute[62208]: ERROR nova.compute.manager [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] [ 1702.705926] nova-compute[62208]: DEBUG nova.compute.utils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1702.707051] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Build of instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 was re-scheduled: A specified parameter was not correct: fileType [ 1702.707051] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1702.707414] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1702.707585] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1702.707750] nova-compute[62208]: DEBUG nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1702.707908] nova-compute[62208]: DEBUG nova.network.neutron [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1703.000830] nova-compute[62208]: DEBUG nova.network.neutron [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1703.014938] nova-compute[62208]: INFO nova.compute.manager [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Took 0.31 seconds to deallocate network for instance. [ 1703.115420] nova-compute[62208]: INFO nova.scheduler.client.report [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted allocations for instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 [ 1703.135748] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-c35ff189-1d3d-4afe-b155-8cdebd0b35f8 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 534.990s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1703.137135] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 338.660s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1703.137386] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.137594] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s 
{{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1703.137857] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1703.140178] nova-compute[62208]: INFO nova.compute.manager [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Terminating instance [ 1703.142120] nova-compute[62208]: DEBUG nova.compute.manager [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1703.142358] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1703.142899] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f95f5b48-04bd-405c-8caf-70c9e8efc831 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.145428] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1703.145428] nova-compute[62208]: warnings.warn( [ 1703.152818] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad494030-a00b-4e21-b85c-1d6dc47c6689 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.163810] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1703.163810] nova-compute[62208]: warnings.warn( [ 1703.182867] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c0d7e5a6-e905-47ee-87d7-cda8543be1f2 could not be found. 
[ 1703.183139] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1703.183276] nova-compute[62208]: INFO nova.compute.manager [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1703.183600] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1703.184033] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1703.186521] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1703.186627] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1703.217240] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1703.231618] nova-compute[62208]: INFO nova.compute.manager [-] [instance: c0d7e5a6-e905-47ee-87d7-cda8543be1f2] Took 0.04 seconds to deallocate network for instance. 
[ 1703.242849] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.243122] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1703.244719] nova-compute[62208]: INFO nova.compute.claims [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1703.336090] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-62685637-0469-4eea-9b07-53ea8fba9cca tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "c0d7e5a6-e905-47ee-87d7-cda8543be1f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.199s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1703.463485] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74080bc1-0dd6-4b96-905c-cb6c7b2bca50 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.466009] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1703.466009] nova-compute[62208]: warnings.warn( [ 1703.471457] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8afc6b7-8c6b-4552-aa4b-743c6a1d1d4c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.474888] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1703.474888] nova-compute[62208]: warnings.warn( [ 1703.502502] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69044f11-4949-466a-bdf6-f510502f0e67 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.504927] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1703.504927] nova-compute[62208]: warnings.warn( [ 1703.510186] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67075a18-ffad-4b4f-a571-66c5b1922de8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.514137] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1703.514137] nova-compute[62208]: warnings.warn( [ 1703.523746] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1703.531962] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1703.548879] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1703.549495] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1703.586887] nova-compute[62208]: DEBUG nova.compute.utils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1703.588161] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1703.588336] nova-compute[62208]: DEBUG nova.network.neutron [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1703.600461] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1703.633504] nova-compute[62208]: DEBUG nova.policy [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48cf6bc9785d46088589c14e7e8c14ca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '910bab22145d4f8cbd354ecf005eed6a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1703.678674] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1703.700424] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1703.700670] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1703.700831] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1703.701013] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1703.701160] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1703.701306] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1703.701509] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1703.701666] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1703.701832] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1703.701992] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1703.702163] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1703.703027] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd2be546-95c9-443f-812b-ff8fb6855b59 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.705452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1703.705452] nova-compute[62208]: warnings.warn( [ 1703.711369] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b99ea111-c03b-481b-a4ea-adfa9fdd1716 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.715071] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1703.715071] nova-compute[62208]: warnings.warn( [ 1704.006583] nova-compute[62208]: DEBUG nova.network.neutron [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Successfully created port: 75903200-9fd7-4168-9ea8-eca1e084e9ec {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1704.536528] nova-compute[62208]: DEBUG nova.network.neutron [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Successfully updated port: 75903200-9fd7-4168-9ea8-eca1e084e9ec {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1704.549099] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "refresh_cache-c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1704.549250] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "refresh_cache-c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1704.549398] nova-compute[62208]: DEBUG nova.network.neutron [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1704.597388] nova-compute[62208]: DEBUG nova.network.neutron [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1704.756761] nova-compute[62208]: DEBUG nova.network.neutron [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Updating instance_info_cache with network_info: [{"id": "75903200-9fd7-4168-9ea8-eca1e084e9ec", "address": "fa:16:3e:ad:7b:2c", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap75903200-9f", "ovs_interfaceid": "75903200-9fd7-4168-9ea8-eca1e084e9ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1704.771907] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock "refresh_cache-c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1704.772237] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Instance network_info: |[{"id": "75903200-9fd7-4168-9ea8-eca1e084e9ec", "address": "fa:16:3e:ad:7b:2c", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap75903200-9f", "ovs_interfaceid": "75903200-9fd7-4168-9ea8-eca1e084e9ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 
1704.772667] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:7b:2c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '10b81051-1eb1-406b-888c-4548c470c77e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '75903200-9fd7-4168-9ea8-eca1e084e9ec', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1704.780085] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1704.780850] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1704.781160] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4a059522-54c3-4baa-a04b-b402f16830e0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1704.796364] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1704.796364] nova-compute[62208]: warnings.warn( [ 1704.803705] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1704.803705] nova-compute[62208]: value = "task-38626" [ 1704.803705] nova-compute[62208]: _type = "Task" [ 1704.803705] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1704.806263] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1704.806263] nova-compute[62208]: warnings.warn( [ 1704.815712] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38626, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1705.041953] nova-compute[62208]: DEBUG nova.compute.manager [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Received event network-vif-plugged-75903200-9fd7-4168-9ea8-eca1e084e9ec {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1705.042194] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] Acquiring lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1705.042413] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1705.042590] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1705.042758] nova-compute[62208]: DEBUG nova.compute.manager [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] No waiting events found dispatching network-vif-plugged-75903200-9fd7-4168-9ea8-eca1e084e9ec {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1705.042925] nova-compute[62208]: WARNING nova.compute.manager [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Received unexpected event network-vif-plugged-75903200-9fd7-4168-9ea8-eca1e084e9ec for instance with vm_state building and task_state spawning. [ 1705.043084] nova-compute[62208]: DEBUG nova.compute.manager [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Received event network-changed-75903200-9fd7-4168-9ea8-eca1e084e9ec {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1705.043271] nova-compute[62208]: DEBUG nova.compute.manager [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Refreshing instance network info cache due to event network-changed-75903200-9fd7-4168-9ea8-eca1e084e9ec. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1705.043417] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] Acquiring lock "refresh_cache-c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1705.043550] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] Acquired lock "refresh_cache-c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1705.043706] nova-compute[62208]: DEBUG nova.network.neutron [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Refreshing network info cache for port 75903200-9fd7-4168-9ea8-eca1e084e9ec {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1705.287462] nova-compute[62208]: DEBUG nova.network.neutron [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Updated VIF entry in instance network info cache for port 75903200-9fd7-4168-9ea8-eca1e084e9ec. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1705.287823] nova-compute[62208]: DEBUG nova.network.neutron [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Updating instance_info_cache with network_info: [{"id": "75903200-9fd7-4168-9ea8-eca1e084e9ec", "address": "fa:16:3e:ad:7b:2c", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap75903200-9f", "ovs_interfaceid": "75903200-9fd7-4168-9ea8-eca1e084e9ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1705.297961] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-9ccd883b-cecd-4fc3-b83f-a1ef646be738 req-9dd03219-a186-4287-a129-3604bea541b3 service nova] Releasing lock "refresh_cache-c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1705.307754] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1705.307754] nova-compute[62208]: warnings.warn( [ 1705.313617] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38626, 'name': CreateVM_Task, 'duration_secs': 0.321913} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1705.313784] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1705.314363] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1705.314586] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1705.317396] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a13711-dbe5-4c15-91f0-73f879fb93b4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1705.327434] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1705.327434] nova-compute[62208]: warnings.warn( [ 1705.350980] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Reconfiguring VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1705.351358] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-8af6abf3-968b-4462-ad2d-dae8324baa2c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1705.361499] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1705.361499] nova-compute[62208]: warnings.warn( [ 1705.367023] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 1705.367023] nova-compute[62208]: value = "task-38627" [ 1705.367023] nova-compute[62208]: _type = "Task" [ 1705.367023] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1705.370039] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1705.370039] nova-compute[62208]: warnings.warn( [ 1705.375595] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38627, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1705.871376] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1705.871376] nova-compute[62208]: warnings.warn( [ 1705.877533] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38627, 'name': ReconfigVM_Task, 'duration_secs': 0.102649} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1705.877816] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Reconfigured VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1705.878072] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.563s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1705.878330] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1705.878478] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1705.878805] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1705.879099] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dc1c4933-25fb-4f4d-9f12-19a201cc131e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1705.880800] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1705.880800] nova-compute[62208]: warnings.warn( [ 1705.884209] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 1705.884209] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e19f4f-e69d-87fd-442e-9c9de4f92b04" [ 1705.884209] nova-compute[62208]: _type = "Task" [ 1705.884209] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1705.887186] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1705.887186] nova-compute[62208]: warnings.warn( [ 1705.892835] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e19f4f-e69d-87fd-442e-9c9de4f92b04, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1706.388497] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1706.388497] nova-compute[62208]: warnings.warn( [ 1706.395028] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1706.395339] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1706.395636] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1723.141886] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1726.142582] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1729.136360] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1729.141023] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1730.141129] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1730.151174] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1730.151405] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.151591] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.151759] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1730.152906] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9254b7fa-52af-4b33-9add-e1567b38b829 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.156149] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1730.156149] nova-compute[62208]: warnings.warn( [ 1730.162091] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6baaaa12-4654-482a-9989-e6de1908f77f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.165653] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1730.165653] nova-compute[62208]: warnings.warn( [ 1730.176117] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9df98aa3-dd33-464f-98e4-290f70069537 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.178543] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1730.178543] nova-compute[62208]: warnings.warn( [ 1730.183059] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0fc24af-3ba9-430d-9a7c-ce389a0bedfb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.186079] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1730.186079] nova-compute[62208]: warnings.warn( [ 1730.213121] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181956MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1730.213282] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1730.213465] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.336831] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e45ad927-7d07-43d5-84b8-339c68981de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.337026] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.337161] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.337287] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.337409] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.337529] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.337647] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.337762] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.337876] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.338028] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1730.350033] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1730.360288] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1730.370410] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 638c73fd-6c45-4f50-8078-2a61f3339ad2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1730.370641] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1730.370788] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1730.386384] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing inventories for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1730.399765] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating ProviderTree inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1730.399949] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with 
inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1730.410375] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing aggregate associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, aggregates: None {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1730.426302] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing trait associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1730.591588] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-726a6294-f768-413f-b90c-bf35368aef64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.594290] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1730.594290] nova-compute[62208]: warnings.warn( [ 1730.599527] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d01bd7ca-781e-4675-a1ea-95dbc556605b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.602807] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1730.602807] nova-compute[62208]: warnings.warn( [ 1730.630527] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28c4c607-0832-4f5d-8a6e-b36650a223fd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.633546] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1730.633546] nova-compute[62208]: warnings.warn( [ 1730.638939] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ce190e1-1abd-494c-8b2f-f36c7c496396 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.643770] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1730.643770] nova-compute[62208]: warnings.warn( [ 1730.653408] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1730.662447] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1730.677816] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1730.678031] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.465s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1731.141623] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1731.142033] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1731.142033] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1731.162318] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.162491] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.162593] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.162717] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.162837] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.162956] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.163073] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.163190] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.163306] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.163421] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1731.163540] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1731.164171] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1731.164335] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances with incomplete migration {{(pid=62208) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11257}} [ 1732.148735] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1733.141956] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1734.141582] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1734.141992] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11219}} [ 1734.153790] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] There are 0 instances to clean {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11228}} [ 1735.154272] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1735.154638] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1737.136685] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1747.142845] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1749.589890] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1749.589890] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1749.590679] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1749.592493] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1749.592751] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/e6ee9164-88ba-4066-8473-b856688bec7e/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1749.593054] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e5d05c9e-9643-4607-b35c-c7cf664029af {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.595560] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1749.595560] nova-compute[62208]: warnings.warn( [ 1749.603124] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 1749.603124] nova-compute[62208]: value = "task-38628" [ 1749.603124] nova-compute[62208]: _type = "Task" [ 1749.603124] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1749.606266] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1749.606266] nova-compute[62208]: warnings.warn( [ 1749.611529] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38628, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.107820] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.107820] nova-compute[62208]: warnings.warn( [ 1750.113939] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1750.114223] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1750.114775] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Traceback (most recent call last): [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] yield resources [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] self.driver.spawn(context, instance, image_meta, [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] self._fetch_image_if_missing(context, vi) [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] image_cache(vi, tmp_image_ds_loc) [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] vm_util.copy_virtual_disk( [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] 
session._wait_for_task(vmdk_copy_task) [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] return self.wait_for_task(task_ref) [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] return evt.wait() [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] result = hub.switch() [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] return self.greenlet.switch() [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] self.f(*self.args, **self.kw) [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] raise exceptions.translate_fault(task_info.error) [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Faults: ['InvalidArgument'] [ 1750.114775] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] [ 1750.116017] nova-compute[62208]: INFO nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Terminating instance [ 1750.116719] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1750.116927] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 
tempest-SecurityGroupsTestJSON-1742011264-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1750.117170] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-361e3ba7-6920-469f-9a5e-18a55e75b287 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.119679] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1750.119864] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1750.120606] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b291d37-6f3b-412e-89f8-076e445efbbf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.122808] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.122808] nova-compute[62208]: warnings.warn( [ 1750.123235] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.123235] nova-compute[62208]: warnings.warn( [ 1750.127451] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1750.127690] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-230f5c92-29d2-41ad-9026-7a6b5705ee21 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.129931] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1750.130099] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1750.130656] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.130656] nova-compute[62208]: warnings.warn( [ 1750.131039] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-323a0036-4176-4122-bb9f-469833422b09 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.133069] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.133069] nova-compute[62208]: warnings.warn( [ 1750.136400] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 1750.136400] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5228a352-59ed-317b-0879-558175b41ab0" [ 1750.136400] nova-compute[62208]: _type = "Task" [ 1750.136400] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1750.139097] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.139097] nova-compute[62208]: warnings.warn( [ 1750.143630] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5228a352-59ed-317b-0879-558175b41ab0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.196536] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1750.196780] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1750.196935] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleting the datastore file [datastore2] e45ad927-7d07-43d5-84b8-339c68981de6 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1750.197201] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4802536f-13fe-41d5-bc4e-7ef2fd531472 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.198991] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.198991] nova-compute[62208]: warnings.warn( [ 1750.203378] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 1750.203378] nova-compute[62208]: value = "task-38630" [ 1750.203378] nova-compute[62208]: _type = "Task" [ 1750.203378] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1750.207813] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.207813] nova-compute[62208]: warnings.warn( [ 1750.212760] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38630, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.640445] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.640445] nova-compute[62208]: warnings.warn( [ 1750.646902] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1750.647167] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Creating directory with path [datastore2] vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1750.647401] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0ea6b04c-67f4-4d89-858e-058c17056764 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.649146] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.649146] nova-compute[62208]: warnings.warn( [ 1750.659548] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Created directory with path [datastore2] vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1750.659752] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Fetch image to [datastore2] vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1750.659922] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1750.660777] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce686862-c97f-4a4b-9a62-9e8b2d9a6176 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.663841] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.663841] nova-compute[62208]: warnings.warn( [ 1750.669191] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c98bd6d-3ade-4f0c-9908-8ac559b1e925 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.671687] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.671687] nova-compute[62208]: warnings.warn( [ 1750.679193] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6704de95-09fe-4565-99fb-1939375327a7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.682893] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.682893] nova-compute[62208]: warnings.warn( [ 1750.712477] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a0d0c95-4b5d-438e-9060-d60807f947f2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.714623] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.714623] nova-compute[62208]: warnings.warn( [ 1750.714974] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.714974] nova-compute[62208]: warnings.warn( [ 1750.720356] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38630, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078371} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1750.722010] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1750.722214] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1750.722387] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1750.722564] nova-compute[62208]: INFO nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1750.724611] nova-compute[62208]: DEBUG nova.compute.claims [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9365d1e70> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1750.724785] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.725010] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1750.727601] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9a86db1b-410d-4841-855a-1af9197c9c09 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.729544] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.729544] nova-compute[62208]: warnings.warn( [ 1750.754547] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1750.803950] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1750.861383] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1750.861581] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1750.993241] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff86117e-9359-44ba-9179-05dfad47177c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.996085] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1750.996085] nova-compute[62208]: warnings.warn( [ 1751.001506] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2c00040-32ef-41d8-9d04-e24dc49c450b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.004413] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1751.004413] nova-compute[62208]: warnings.warn( [ 1751.786087] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dd41542-8614-4e92-8be9-421663b84dca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.788728] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1751.788728] nova-compute[62208]: warnings.warn( [ 1751.794135] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e95efe1-dbcd-44e5-89f8-0b8e135117b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.798467] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1751.798467] nova-compute[62208]: warnings.warn( [ 1751.808127] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1751.816700] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1751.837036] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 1.112s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1751.838230] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Traceback (most recent call last): [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] self.driver.spawn(context, instance, image_meta, [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] self._fetch_image_if_missing(context, vi) [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] image_cache(vi, tmp_image_ds_loc) [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] vm_util.copy_virtual_disk( [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] session._wait_for_task(vmdk_copy_task) [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] return self.wait_for_task(task_ref) [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] return evt.wait() [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] result = hub.switch() [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] return self.greenlet.switch() [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] self.f(*self.args, **self.kw) [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] raise exceptions.translate_fault(task_info.error) [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Faults: ['InvalidArgument'] [ 1751.838230] nova-compute[62208]: ERROR nova.compute.manager [instance: e45ad927-7d07-43d5-84b8-339c68981de6] [ 1751.840375] nova-compute[62208]: DEBUG nova.compute.utils 
[None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1751.842343] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Build of instance e45ad927-7d07-43d5-84b8-339c68981de6 was re-scheduled: A specified parameter was not correct: fileType [ 1751.842343] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1751.843006] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1751.843332] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1751.843700] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1751.843995] nova-compute[62208]: DEBUG nova.network.neutron [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1752.149771] nova-compute[62208]: DEBUG nova.network.neutron [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1752.164842] nova-compute[62208]: INFO nova.compute.manager [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Took 0.32 seconds to deallocate network for instance. 
[ 1752.263570] nova-compute[62208]: INFO nova.scheduler.client.report [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleted allocations for instance e45ad927-7d07-43d5-84b8-339c68981de6 [ 1752.288301] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2749e1d-456c-4dd8-93cb-2ff052264a7c tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "e45ad927-7d07-43d5-84b8-339c68981de6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 549.151s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.289527] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "e45ad927-7d07-43d5-84b8-339c68981de6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 353.328s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1752.289746] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "e45ad927-7d07-43d5-84b8-339c68981de6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1752.289948] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "e45ad927-7d07-43d5-84b8-339c68981de6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1752.290114] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "e45ad927-7d07-43d5-84b8-339c68981de6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.292311] nova-compute[62208]: INFO nova.compute.manager [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Terminating instance [ 1752.294241] nova-compute[62208]: DEBUG nova.compute.manager [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1752.294487] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1752.295035] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-89615332-f426-4649-a216-00a037d9e1ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.297480] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1752.297480] nova-compute[62208]: warnings.warn( [ 1752.304850] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0caa03a6-8d57-4f03-a4a3-52b276d72f3b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.315621] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1752.318332] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1752.318332] nova-compute[62208]: warnings.warn( [ 1752.336950] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e45ad927-7d07-43d5-84b8-339c68981de6 could not be found. [ 1752.337155] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1752.337330] nova-compute[62208]: INFO nova.compute.manager [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1752.337568] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1752.337777] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1752.337871] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1752.365393] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1752.368343] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1752.368572] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1752.370039] nova-compute[62208]: INFO nova.compute.claims [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1752.373436] nova-compute[62208]: INFO nova.compute.manager [-] [instance: e45ad927-7d07-43d5-84b8-339c68981de6] Took 0.04 seconds to deallocate network for instance. [ 1752.465045] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2eae14b9-3a91-464e-b7c4-049bb571b3c2 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "e45ad927-7d07-43d5-84b8-339c68981de6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.175s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.568931] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6667f8f-68ff-4b4a-a707-b2e9f59fa1e4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.571429] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1752.571429] nova-compute[62208]: warnings.warn( [ 1752.576566] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26b3ec7d-6491-4ad8-b224-5038f4fc854a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.579665] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1752.579665] nova-compute[62208]: warnings.warn( [ 1752.605397] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e2239f8-e4c2-4dd5-824c-e0134081dda8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.607698] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1752.607698] nova-compute[62208]: warnings.warn( [ 1752.613116] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7bad674-2c1a-4c93-8ec4-99e04deb20b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.616670] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1752.616670] nova-compute[62208]: warnings.warn( [ 1752.627145] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1752.636874] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1752.656720] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.657244] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1752.693829] nova-compute[62208]: DEBUG nova.compute.utils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1752.695182] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1752.695601] nova-compute[62208]: DEBUG nova.network.neutron [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1752.706639] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1752.741708] nova-compute[62208]: DEBUG nova.policy [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '931207206d284e4db60fd5aabf7648f8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '178d7b9219794edc8a3f6879910bab4b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1752.779898] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1752.802896] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1752.803229] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1752.803406] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1752.803624] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1752.803773] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1752.803920] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1752.804141] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1752.804302] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1752.804470] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1752.804629] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1752.804798] nova-compute[62208]: DEBUG nova.virt.hardware [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1752.805645] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c59f95cf-9905-425c-8bfc-d7d8764880b4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.809127] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1752.809127] nova-compute[62208]: warnings.warn( [ 1752.816931] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-753f5b0a-1d3a-4efc-aca6-99f11902a4ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.821563] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1752.821563] nova-compute[62208]: warnings.warn( [ 1753.234625] nova-compute[62208]: DEBUG nova.network.neutron [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Successfully created port: 80490b79-8e18-4406-be0e-d2bdfa514b4c {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1753.808155] nova-compute[62208]: DEBUG nova.network.neutron [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Successfully updated port: 80490b79-8e18-4406-be0e-d2bdfa514b4c {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1753.822658] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "refresh_cache-70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1753.822821] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquired lock "refresh_cache-70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1753.822975] nova-compute[62208]: DEBUG nova.network.neutron [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1753.864655] nova-compute[62208]: DEBUG nova.network.neutron [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1754.085131] nova-compute[62208]: DEBUG nova.network.neutron [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Updating instance_info_cache with network_info: [{"id": "80490b79-8e18-4406-be0e-d2bdfa514b4c", "address": "fa:16:3e:b1:1c:a1", "network": {"id": "2a58a4a4-f5e7-409c-9873-19d373cb9375", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-608537532-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "178d7b9219794edc8a3f6879910bab4b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": "nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80490b79-8e", "ovs_interfaceid": "80490b79-8e18-4406-be0e-d2bdfa514b4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1754.099594] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Releasing lock "refresh_cache-70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1754.099988] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Instance network_info: |[{"id": "80490b79-8e18-4406-be0e-d2bdfa514b4c", "address": "fa:16:3e:b1:1c:a1", "network": {"id": "2a58a4a4-f5e7-409c-9873-19d373cb9375", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-608537532-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "178d7b9219794edc8a3f6879910bab4b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": "nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80490b79-8e", "ovs_interfaceid": "80490b79-8e18-4406-be0e-d2bdfa514b4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1754.100700] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b1:1c:a1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '80490b79-8e18-4406-be0e-d2bdfa514b4c', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1754.108329] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1754.109537] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1754.109980] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e561eaca-acdb-426a-a02c-de1db66d2e30 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.124074] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1754.124074] nova-compute[62208]: warnings.warn( [ 1754.131159] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1754.131159] nova-compute[62208]: value = "task-38631" [ 1754.131159] nova-compute[62208]: _type = "Task" [ 1754.131159] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1754.134973] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1754.134973] nova-compute[62208]: warnings.warn( [ 1754.140647] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38631, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1754.193157] nova-compute[62208]: DEBUG nova.compute.manager [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Received event network-vif-plugged-80490b79-8e18-4406-be0e-d2bdfa514b4c {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1754.193395] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] Acquiring lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1754.193698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1754.193938] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1754.194161] nova-compute[62208]: DEBUG nova.compute.manager [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] No waiting events found dispatching network-vif-plugged-80490b79-8e18-4406-be0e-d2bdfa514b4c {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1754.194341] nova-compute[62208]: WARNING nova.compute.manager [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Received unexpected event network-vif-plugged-80490b79-8e18-4406-be0e-d2bdfa514b4c for instance with vm_state building and task_state spawning. [ 1754.194505] nova-compute[62208]: DEBUG nova.compute.manager [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Received event network-changed-80490b79-8e18-4406-be0e-d2bdfa514b4c {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1754.194661] nova-compute[62208]: DEBUG nova.compute.manager [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Refreshing instance network info cache due to event network-changed-80490b79-8e18-4406-be0e-d2bdfa514b4c. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1754.194852] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] Acquiring lock "refresh_cache-70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1754.194988] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] Acquired lock "refresh_cache-70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1754.195149] nova-compute[62208]: DEBUG nova.network.neutron [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Refreshing network info cache for port 80490b79-8e18-4406-be0e-d2bdfa514b4c {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1754.504726] nova-compute[62208]: DEBUG nova.network.neutron [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Updated VIF entry in instance network info cache for port 80490b79-8e18-4406-be0e-d2bdfa514b4c. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1754.505111] nova-compute[62208]: DEBUG nova.network.neutron [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Updating instance_info_cache with network_info: [{"id": "80490b79-8e18-4406-be0e-d2bdfa514b4c", "address": "fa:16:3e:b1:1c:a1", "network": {"id": "2a58a4a4-f5e7-409c-9873-19d373cb9375", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-608537532-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "178d7b9219794edc8a3f6879910bab4b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": "nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80490b79-8e", "ovs_interfaceid": "80490b79-8e18-4406-be0e-d2bdfa514b4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1754.515171] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-33659f82-42ab-4788-8a47-3107808788dc req-a447461b-235c-44bc-a5ce-96466ff81e0e service nova] Releasing lock "refresh_cache-70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1754.636267] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1754.636267] nova-compute[62208]: warnings.warn( [ 1754.643134] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38631, 'name': CreateVM_Task, 'duration_secs': 0.306019} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1754.643423] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1754.644318] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1754.644790] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1754.649291] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9416e8f-1120-4c07-966d-0fb89d1dc837 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.665158] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1754.665158] nova-compute[62208]: warnings.warn( [ 1754.695243] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Reconfiguring VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1754.695691] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-8229c378-28fe-4db4-a76c-7fa844c2f56e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.706071] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1754.706071] nova-compute[62208]: warnings.warn( [ 1754.712344] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 1754.712344] nova-compute[62208]: value = "task-38632" [ 1754.712344] nova-compute[62208]: _type = "Task" [ 1754.712344] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1754.715835] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1754.715835] nova-compute[62208]: warnings.warn( [ 1754.721305] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38632, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1755.217530] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1755.217530] nova-compute[62208]: warnings.warn( [ 1755.224669] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38632, 'name': ReconfigVM_Task, 'duration_secs': 0.1116} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1755.224875] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Reconfigured VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1755.225100] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.581s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1755.225354] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1755.225502] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1755.225826] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1755.226104] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6010c0ca-69eb-4c06-bd2c-17333c214d86 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.228177] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1755.228177] nova-compute[62208]: warnings.warn( [ 1755.232306] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 1755.232306] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521af4f6-dd97-5fe3-8e25-0115dda9a813" [ 1755.232306] nova-compute[62208]: _type = "Task" [ 1755.232306] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1755.235414] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1755.235414] nova-compute[62208]: warnings.warn( [ 1755.240710] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521af4f6-dd97-5fe3-8e25-0115dda9a813, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1755.736687] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1755.736687] nova-compute[62208]: warnings.warn( [ 1755.743136] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1755.743419] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1755.743635] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1757.181699] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "e0a444fc-dca2-419a-9ac1-8d71048e1690" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1757.182040] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "e0a444fc-dca2-419a-9ac1-8d71048e1690" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1757.567289] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "68b1024d-2bfd-4999-9ba2-f2558c223885" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1783.150866] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1786.141243] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1790.141701] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1790.142042] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1790.152571] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1790.152798] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1790.153731] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1790.153731] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1790.154202] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea69492c-f04c-40f6-a2d9-3fd5bdda5fa0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.157184] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1790.157184] nova-compute[62208]: warnings.warn( [ 1790.163182] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cc3d695-93ca-4c33-b739-5a3d3880723d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.166779] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1790.166779] nova-compute[62208]: warnings.warn( [ 1790.177269] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c082eae-f06e-441e-8d2a-adbe79c50185 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.179520] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1790.179520] nova-compute[62208]: warnings.warn( [ 1790.183952] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b92bbcd3-9340-4174-8183-592dc9170099 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.186937] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1790.186937] nova-compute[62208]: warnings.warn( [ 1790.212802] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181969MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1790.212998] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1790.213162] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1790.277741] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance f000e638-100f-4a53-853d-4a94ffe71bed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.277909] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.278079] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.278214] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.278336] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.278454] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.278570] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.278682] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.278800] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.278915] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1790.290748] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1790.301785] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 638c73fd-6c45-4f50-8078-2a61f3339ad2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1790.312336] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1790.312617] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1790.312721] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1790.480170] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddf7be94-7acc-4bea-be25-9ae99617990f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.482603] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1790.482603] nova-compute[62208]: warnings.warn( [ 1790.488255] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9393281e-6648-4246-8260-bb9f33e706a0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.491294] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1790.491294] nova-compute[62208]: warnings.warn( [ 1790.518572] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7402101b-08a3-4804-9b3d-1e4c182b0b76 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.521125] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1790.521125] nova-compute[62208]: warnings.warn( [ 1790.526745] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cee6037-7f7b-40ea-af66-2a08ba358dc9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1790.530687] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1790.530687] nova-compute[62208]: warnings.warn( [ 1790.540470] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1790.548580] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1790.565470] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1790.565677] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1791.560597] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1792.140613] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1792.140814] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1792.140948] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1792.161885] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.162089] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.162173] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.162302] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.162425] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.162548] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.162666] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.162784] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.162900] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.163015] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1792.163134] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1792.163689] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1792.385128] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1794.141376] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1796.141132] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1796.141383] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1797.012462] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1797.012462] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1797.013018] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 
tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1797.015082] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1797.015328] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Copying Virtual Disk [datastore2] vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/cb81f4c7-1dd1-4344-9326-77145f85e49b/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1797.015625] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-441e75be-144c-4896-8a1a-abc420049d4d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.018305] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.018305] nova-compute[62208]: warnings.warn( [ 1797.024744] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 1797.024744] nova-compute[62208]: value = "task-38633" [ 1797.024744] nova-compute[62208]: _type = "Task" [ 1797.024744] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1797.027806] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.027806] nova-compute[62208]: warnings.warn( [ 1797.033350] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38633, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1797.529564] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.529564] nova-compute[62208]: warnings.warn( [ 1797.536243] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1797.536535] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1797.537153] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Traceback (most recent call last): [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] yield resources [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] self.driver.spawn(context, instance, image_meta, [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] self._fetch_image_if_missing(context, vi) [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] image_cache(vi, tmp_image_ds_loc) [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1797.537153] nova-compute[62208]: ERROR 
nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] vm_util.copy_virtual_disk( [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] session._wait_for_task(vmdk_copy_task) [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] return self.wait_for_task(task_ref) [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] return evt.wait() [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] result = hub.switch() [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] return self.greenlet.switch() [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] self.f(*self.args, **self.kw) [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] raise exceptions.translate_fault(task_info.error) [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Faults: ['InvalidArgument'] [ 1797.537153] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] [ 1797.538368] nova-compute[62208]: INFO nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Terminating instance [ 1797.539045] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1797.539825] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1797.539825] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-280187fc-da2e-4f78-b0cc-37c836eace44 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.541874] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1797.542065] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1797.542790] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f45785c1-d1ff-4593-9300-d9eeb7b59e24 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.545963] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.545963] nova-compute[62208]: warnings.warn( [ 1797.546348] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.546348] nova-compute[62208]: warnings.warn( [ 1797.550910] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1797.551243] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e3d26f66-41e4-4e11-8367-d4502c4fdb48 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.553771] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1797.553955] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1797.554574] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.554574] nova-compute[62208]: warnings.warn( [ 1797.554998] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-16a78806-265b-4fff-a9f6-7c16e366dc13 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.557458] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.557458] nova-compute[62208]: warnings.warn( [ 1797.560669] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 1797.560669] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524e3b6e-ad7e-2a54-6ea0-1fbdaf1cb9f4" [ 1797.560669] nova-compute[62208]: _type = "Task" [ 1797.560669] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1797.564085] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.564085] nova-compute[62208]: warnings.warn( [ 1797.569149] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524e3b6e-ad7e-2a54-6ea0-1fbdaf1cb9f4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1797.632036] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1797.632417] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1797.632633] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Deleting the datastore file [datastore2] f000e638-100f-4a53-853d-4a94ffe71bed {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1797.632901] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-465d6c37-788c-4662-8ee6-a00f5f07a9e8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.634802] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.634802] nova-compute[62208]: warnings.warn( [ 1797.640068] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 1797.640068] nova-compute[62208]: value = "task-38635" [ 1797.640068] nova-compute[62208]: _type = "Task" [ 1797.640068] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1797.645024] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1797.645024] nova-compute[62208]: warnings.warn( [ 1797.650352] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38635, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1798.065488] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.065488] nova-compute[62208]: warnings.warn( [ 1798.071787] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1798.072158] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1798.072473] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-40455410-fab4-4d86-8a0d-1e416386b982 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.074182] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.074182] nova-compute[62208]: warnings.warn( [ 1798.084696] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1798.085009] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Fetch image to [datastore2] vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1798.085248] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1798.086101] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8585742a-590e-407e-9c8f-8cb0021a0013 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.088581] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.088581] nova-compute[62208]: warnings.warn( [ 1798.093406] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7692264a-8898-48ef-bf70-7d52e1e336fe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.095739] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.095739] nova-compute[62208]: warnings.warn( [ 1798.103114] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5b85895-c5d6-4690-bd96-83d355601483 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.106807] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.106807] nova-compute[62208]: warnings.warn( [ 1798.135447] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a21e59b9-0590-4521-9fdf-a1c1e3cd82b2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.138040] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.138040] nova-compute[62208]: warnings.warn( [ 1798.145165] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c929b9fd-b6ec-41ec-946b-d94cb4462cbd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.146902] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.146902] nova-compute[62208]: warnings.warn( [ 1798.147330] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.147330] nova-compute[62208]: warnings.warn( [ 1798.152327] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38635, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070835} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1798.152709] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1798.152991] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1798.153237] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1798.153496] nova-compute[62208]: INFO nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1798.155846] nova-compute[62208]: DEBUG nova.compute.claims [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Aborting claim: <nova.compute.claims.Claim object at 0x7fb937513670> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1798.156131] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1798.156436] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1798.167641] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1798.218390] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = 
https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1798.275314] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1798.275530] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1798.416125] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e17dc0f-6830-4e7c-8cdd-0069512046d6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.418889] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.418889] nova-compute[62208]: warnings.warn( [ 1798.424538] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e01abf4-0043-42f4-88d7-069c8ed248e5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.427534] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.427534] nova-compute[62208]: warnings.warn( [ 1798.456043] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5447484-e4a2-44c2-9ef4-8ad8ac5bd506 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.457682] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.457682] nova-compute[62208]: warnings.warn( [ 1798.463095] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da9c50d-ae8c-42e3-a8fc-bd39c5b79552 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.466936] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.466936] nova-compute[62208]: warnings.warn( [ 1798.477306] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1798.485761] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1798.502957] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.346s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1798.503503] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Traceback (most recent call last): [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] self.driver.spawn(context, instance, image_meta, [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn 
[ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] self._fetch_image_if_missing(context, vi) [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] image_cache(vi, tmp_image_ds_loc) [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] vm_util.copy_virtual_disk( [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] session._wait_for_task(vmdk_copy_task) [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] return self.wait_for_task(task_ref) [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] return evt.wait() [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] result = hub.switch() [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] return self.greenlet.switch() [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] self.f(*self.args, **self.kw) [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1798.503503] nova-compute[62208]: ERROR 
nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] raise exceptions.translate_fault(task_info.error) [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Faults: ['InvalidArgument'] [ 1798.503503] nova-compute[62208]: ERROR nova.compute.manager [instance: f000e638-100f-4a53-853d-4a94ffe71bed] [ 1798.504576] nova-compute[62208]: DEBUG nova.compute.utils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1798.506036] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Build of instance f000e638-100f-4a53-853d-4a94ffe71bed was re-scheduled: A specified parameter was not correct: fileType [ 1798.506036] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1798.506451] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1798.506630] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1798.506985] nova-compute[62208]: DEBUG nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1798.507163] nova-compute[62208]: DEBUG nova.network.neutron [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1798.794084] nova-compute[62208]: DEBUG nova.network.neutron [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1798.806636] nova-compute[62208]: INFO nova.compute.manager [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Took 0.30 seconds to deallocate network for instance. [ 1798.902524] nova-compute[62208]: INFO nova.scheduler.client.report [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Deleted allocations for instance f000e638-100f-4a53-853d-4a94ffe71bed [ 1798.921874] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d48e52e1-3bcb-43fa-843a-c4e80e468b04 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "f000e638-100f-4a53-853d-4a94ffe71bed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 585.980s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1798.923421] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "f000e638-100f-4a53-853d-4a94ffe71bed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 389.745s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1798.923652] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "f000e638-100f-4a53-853d-4a94ffe71bed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1798.923859] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock 
"f000e638-100f-4a53-853d-4a94ffe71bed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1798.924151] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "f000e638-100f-4a53-853d-4a94ffe71bed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1798.926084] nova-compute[62208]: INFO nova.compute.manager [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Terminating instance [ 1798.927906] nova-compute[62208]: DEBUG nova.compute.manager [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1798.928203] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1798.928732] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ce1c6275-e1f8-4228-8700-4f3d50290c46 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.931305] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.931305] nova-compute[62208]: warnings.warn( [ 1798.939257] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77fdd27f-e463-4ae4-9b45-7f0c8d39bc1e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.951143] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1798.954138] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1798.954138] nova-compute[62208]: warnings.warn( [ 1798.973110] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f000e638-100f-4a53-853d-4a94ffe71bed could not be found. [ 1798.973326] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1798.973509] nova-compute[62208]: INFO nova.compute.manager [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1798.973749] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1798.973980] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1798.974076] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1799.003112] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1799.005311] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.005556] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1799.007089] nova-compute[62208]: INFO nova.compute.claims [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1799.012454] nova-compute[62208]: INFO nova.compute.manager [-] [instance: f000e638-100f-4a53-853d-4a94ffe71bed] Took 0.04 seconds to deallocate network for instance. [ 1799.114130] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dbaf8a69-13eb-4b0f-9f97-d6be8b8ba1da tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "f000e638-100f-4a53-853d-4a94ffe71bed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.191s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1799.228850] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d3f9100-3568-4b60-8f6e-d605b9e3ce74 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.231402] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1799.231402] nova-compute[62208]: warnings.warn( [ 1799.236496] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14513ed2-efb2-4cee-9499-6e3f3f40404f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.241433] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1799.241433] nova-compute[62208]: warnings.warn( [ 1799.268172] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0388a95b-0e48-474a-a3a9-d6c22cb2b598 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.270532] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1799.270532] nova-compute[62208]: warnings.warn( [ 1799.275511] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-730da69b-9ac6-407c-88a1-b53ac6b0d2cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.279648] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1799.279648] nova-compute[62208]: warnings.warn( [ 1799.288809] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1799.297811] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1799.314566] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1799.315097] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1799.348539] nova-compute[62208]: DEBUG nova.compute.utils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1799.350227] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1799.350417] nova-compute[62208]: DEBUG nova.network.neutron [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1799.359726] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_power_states {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1799.364291] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1799.380636] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Getting list of instances from cluster (obj){ [ 1799.380636] nova-compute[62208]: value = "domain-c8" [ 1799.380636] nova-compute[62208]: _type = "ClusterComputeResource" [ 1799.380636] nova-compute[62208]: } {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1799.381914] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bd599a6-5354-4b61-96b3-f9aea18ae22c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.385175] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1799.385175] nova-compute[62208]: warnings.warn( [ 1799.401494] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Got total of 9 instances {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1799.401733] nova-compute[62208]: WARNING nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] While synchronizing instance power states, found 10 instances in the database and 9 instances on the hypervisor. 
[ 1799.401942] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid bbb642f2-aa5c-4e71-b25b-e32acd45f879 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.402227] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid ec31fb88-38c6-400d-b1ec-c93af711a1f6 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.402513] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 47cd2de6-8094-452e-afd7-aa42128a1b0c {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.403068] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 7311ba0c-9a1b-4482-a4eb-6afe993e6656 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.403367] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid ad00920b-3783-4c01-bb25-4f923d29dad7 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.403578] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid ca1b4fca-a4bb-4a37-8e88-45e103a3579f {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.403806] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 68b1024d-2bfd-4999-9ba2-f2558c223885 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.404338] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.404538] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.404723] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 1799.406010] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.406298] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.406552] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.406761] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.407052] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "ad00920b-3783-4c01-bb25-4f923d29dad7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.407355] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.407571] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "68b1024d-2bfd-4999-9ba2-f2558c223885" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.407770] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.407968] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.408230] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.412940] nova-compute[62208]: DEBUG nova.policy [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8cb00a6413b46fcb17cbe532a0bffc5', 'user_domain_id': 'default', 'system_scope': None, 
'domain_id': None, 'project_id': '53b578fa6aa34a2d80eb9938d58ffe12', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1799.446229] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1799.467464] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1799.467714] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1799.467871] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1799.468146] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1799.468328] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1799.468780] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1799.468780] nova-compute[62208]: DEBUG nova.virt.hardware [None 
req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1799.468913] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1799.469057] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1799.469269] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1799.469449] nova-compute[62208]: DEBUG nova.virt.hardware [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1799.470296] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db95d002-e41a-42f3-b845-16386ffab6cf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.474558] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1799.474558] nova-compute[62208]: warnings.warn( [ 1799.480229] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acf8bd4d-ffa9-4b11-92d6-f29c81ec695a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.484088] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1799.484088] nova-compute[62208]: warnings.warn( [ 1799.772633] nova-compute[62208]: DEBUG nova.network.neutron [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Successfully created port: b6dd0106-ce5f-4edf-b41d-0d83b8ed4527 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1800.452986] nova-compute[62208]: DEBUG nova.network.neutron [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Successfully updated port: b6dd0106-ce5f-4edf-b41d-0d83b8ed4527 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1800.464388] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "refresh_cache-3ab2890c-e3d2-43e8-bab4-e3ba689a0529" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1800.464568] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "refresh_cache-3ab2890c-e3d2-43e8-bab4-e3ba689a0529" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1800.464693] nova-compute[62208]: DEBUG nova.network.neutron [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1800.516517] nova-compute[62208]: DEBUG nova.network.neutron [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1800.684162] nova-compute[62208]: DEBUG nova.network.neutron [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Updating instance_info_cache with network_info: [{"id": "b6dd0106-ce5f-4edf-b41d-0d83b8ed4527", "address": "fa:16:3e:ce:47:f4", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6dd0106-ce", "ovs_interfaceid": "b6dd0106-ce5f-4edf-b41d-0d83b8ed4527", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1800.701265] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "refresh_cache-3ab2890c-e3d2-43e8-bab4-e3ba689a0529" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1800.701265] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Instance network_info: |[{"id": "b6dd0106-ce5f-4edf-b41d-0d83b8ed4527", "address": "fa:16:3e:ce:47:f4", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6dd0106-ce", "ovs_interfaceid": "b6dd0106-ce5f-4edf-b41d-0d83b8ed4527", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1800.701736] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ce:47:f4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b5f9472-1844-4c99-8804-8f193cfff562', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b6dd0106-ce5f-4edf-b41d-0d83b8ed4527', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1800.711187] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1800.711730] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1800.749760] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c523f9ae-f38d-4213-9b1f-fdba4c01df38 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.749760] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1800.749760] nova-compute[62208]: warnings.warn( [ 1800.749760] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1800.749760] nova-compute[62208]: value = "task-38636" [ 1800.749760] nova-compute[62208]: _type = "Task" [ 1800.749760] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1800.749760] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1800.749760] nova-compute[62208]: warnings.warn( [ 1800.749760] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38636, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1800.836385] nova-compute[62208]: DEBUG nova.compute.manager [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Received event network-vif-plugged-b6dd0106-ce5f-4edf-b41d-0d83b8ed4527 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1800.836629] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] Acquiring lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1800.836850] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1800.837181] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1800.837362] nova-compute[62208]: DEBUG nova.compute.manager [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] No waiting events found dispatching network-vif-plugged-b6dd0106-ce5f-4edf-b41d-0d83b8ed4527 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1800.837533] nova-compute[62208]: WARNING nova.compute.manager [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Received unexpected event network-vif-plugged-b6dd0106-ce5f-4edf-b41d-0d83b8ed4527 for instance with vm_state building and task_state spawning. [ 1800.837700] nova-compute[62208]: DEBUG nova.compute.manager [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Received event network-changed-b6dd0106-ce5f-4edf-b41d-0d83b8ed4527 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1800.837855] nova-compute[62208]: DEBUG nova.compute.manager [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Refreshing instance network info cache due to event network-changed-b6dd0106-ce5f-4edf-b41d-0d83b8ed4527. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1800.838128] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] Acquiring lock "refresh_cache-3ab2890c-e3d2-43e8-bab4-e3ba689a0529" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1800.838315] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] Acquired lock "refresh_cache-3ab2890c-e3d2-43e8-bab4-e3ba689a0529" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1800.838413] nova-compute[62208]: DEBUG nova.network.neutron [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Refreshing network info cache for port b6dd0106-ce5f-4edf-b41d-0d83b8ed4527 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1801.241986] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1801.241986] nova-compute[62208]: warnings.warn( [ 1801.247598] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38636, 'name': CreateVM_Task, 'duration_secs': 0.312197} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1801.247777] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1801.248433] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1801.249611] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1801.252570] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17e719a4-6cbc-48fa-80fc-63c1a16ed66b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.267043] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1801.267043] nova-compute[62208]: warnings.warn( [ 1801.294472] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Reconfiguring VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1801.294832] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-19e063d4-3509-468d-bbf6-309319445e07 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.309508] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1801.309508] nova-compute[62208]: warnings.warn( [ 1801.315273] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 1801.315273] nova-compute[62208]: value = "task-38637" [ 1801.315273] nova-compute[62208]: _type = "Task" [ 1801.315273] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1801.318221] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1801.318221] nova-compute[62208]: warnings.warn( [ 1801.325177] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38637, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1801.468728] nova-compute[62208]: DEBUG nova.network.neutron [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Updated VIF entry in instance network info cache for port b6dd0106-ce5f-4edf-b41d-0d83b8ed4527. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1801.469114] nova-compute[62208]: DEBUG nova.network.neutron [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Updating instance_info_cache with network_info: [{"id": "b6dd0106-ce5f-4edf-b41d-0d83b8ed4527", "address": "fa:16:3e:ce:47:f4", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6dd0106-ce", "ovs_interfaceid": "b6dd0106-ce5f-4edf-b41d-0d83b8ed4527", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1801.484639] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b82cfd51-2709-404e-a071-4b925890dbe3 req-202055b3-3c57-4b78-988f-d88293d3e15b service nova] Releasing lock "refresh_cache-3ab2890c-e3d2-43e8-bab4-e3ba689a0529" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1801.819620] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1801.819620] nova-compute[62208]: warnings.warn( [ 1801.825291] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38637, 'name': ReconfigVM_Task, 'duration_secs': 0.110597} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1801.825571] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Reconfigured VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1801.825781] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.577s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1801.826027] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1801.826172] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1801.826471] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1801.826726] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e5811501-7f83-45c1-a05e-fe770828a6ad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.832828] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1801.832828] nova-compute[62208]: warnings.warn( [ 1801.834501] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 1801.834501] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52746c15-cb1e-5e88-1f59-8ca22aaa8f62" [ 1801.834501] nova-compute[62208]: _type = "Task" [ 1801.834501] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1801.840836] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1801.840836] nova-compute[62208]: warnings.warn( [ 1801.846681] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52746c15-cb1e-5e88-1f59-8ca22aaa8f62, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1802.338169] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1802.338169] nova-compute[62208]: warnings.warn( [ 1802.344670] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1802.344936] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1802.345324] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1803.052178] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1804.615014] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1807.222487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1807.222789] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.189792] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1845.634520] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1845.634520] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1845.634989] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1845.636893] nova-compute[62208]: DEBUG 
nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1845.637135] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Copying Virtual Disk [datastore2] vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/be84bf86-0799-4168-922d-36b54b210143/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1845.637429] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6719119b-63a9-4401-bd2f-9a32582887f1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.639952] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1845.639952] nova-compute[62208]: warnings.warn( [ 1845.645603] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 1845.645603] nova-compute[62208]: value = "task-38638" [ 1845.645603] nova-compute[62208]: _type = "Task" [ 1845.645603] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1845.649323] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1845.649323] nova-compute[62208]: warnings.warn( [ 1845.654570] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38638, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1846.150574] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.150574] nova-compute[62208]: warnings.warn( [ 1846.156411] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1846.156700] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1846.157254] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Traceback (most recent call last): [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] yield resources [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] self.driver.spawn(context, instance, image_meta, [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] self._fetch_image_if_missing(context, vi) [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] image_cache(vi, tmp_image_ds_loc) [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] vm_util.copy_virtual_disk( [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] 
session._wait_for_task(vmdk_copy_task) [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] return self.wait_for_task(task_ref) [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] return evt.wait() [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] result = hub.switch() [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] return self.greenlet.switch() [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] self.f(*self.args, **self.kw) [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] raise exceptions.translate_fault(task_info.error) [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Faults: ['InvalidArgument'] [ 1846.157254] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] [ 1846.157981] nova-compute[62208]: INFO nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Terminating instance [ 1846.159185] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1846.159383] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Creating directory with 
path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1846.159623] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-97cdfe1b-d9d0-414a-895f-5c687461d5bb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.161986] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1846.162182] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1846.162900] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6867a6bf-8484-41d4-9042-aa085bcec011 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.165247] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.165247] nova-compute[62208]: warnings.warn( [ 1846.165583] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.165583] nova-compute[62208]: warnings.warn( [ 1846.170008] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1846.170237] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-116d3d43-6b2c-495c-8acc-1a2b5c7bfe13 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.172450] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1846.172623] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1846.173187] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.173187] nova-compute[62208]: warnings.warn( [ 1846.173580] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2a1acae6-4620-4212-bc26-c65ce809c505 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.175673] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.175673] nova-compute[62208]: warnings.warn( [ 1846.178638] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 1846.178638] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5206d42b-dbc2-2c92-666a-409046a0170a" [ 1846.178638] nova-compute[62208]: _type = "Task" [ 1846.178638] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1846.181402] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.181402] nova-compute[62208]: warnings.warn( [ 1846.194075] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1846.194375] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Creating directory with path [datastore2] vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1846.194559] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-555eb088-19b2-4f05-b966-096816fdff22 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.196501] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.196501] nova-compute[62208]: warnings.warn( [ 1846.215895] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Created directory with path [datastore2] vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1846.216183] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Fetch image to [datastore2] vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1846.216362] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1846.217219] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abd50987-4428-4acb-b67e-175c55443239 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.219678] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.219678] nova-compute[62208]: warnings.warn( [ 1846.224572] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6e740e9-db1b-435f-b66e-6495379afb19 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.226873] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.226873] nova-compute[62208]: warnings.warn( [ 1846.234497] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa7bf0c6-d61c-44d7-ad86-2c0f30b45354 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.239444] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1846.239642] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1846.239819] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleting the datastore file [datastore2] bbb642f2-aa5c-4e71-b25b-e32acd45f879 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1846.239995] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.239995] nova-compute[62208]: warnings.warn( [ 1846.240501] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7ac07780-7fc7-41ee-a6ff-414df861aca5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.242101] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.242101] nova-compute[62208]: warnings.warn( [ 1846.268924] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19bdd7eb-60c2-440f-b9e7-342fdf256b89 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.271545] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 1846.271545] nova-compute[62208]: value = "task-38640" [ 1846.271545] nova-compute[62208]: _type = "Task" [ 1846.271545] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1846.271734] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.271734] nova-compute[62208]: warnings.warn( [ 1846.276239] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.276239] nova-compute[62208]: warnings.warn( [ 1846.276686] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b96ca23f-3e17-4d37-a929-ba3947adf407 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.280828] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38640, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1846.280941] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.280941] nova-compute[62208]: warnings.warn( [ 1846.302430] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1846.351583] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1846.410968] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1846.411227] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1846.776234] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.776234] nova-compute[62208]: warnings.warn( [ 1846.782275] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38640, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06371} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1846.782547] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1846.782732] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1846.782907] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1846.783084] nova-compute[62208]: INFO nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1846.785131] nova-compute[62208]: DEBUG nova.compute.claims [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936e2abf0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1846.785368] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1846.785533] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.971651] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93cf962f-b4c4-4ac6-8d1a-86cb6bec9832 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.974384] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.974384] nova-compute[62208]: warnings.warn( [ 1846.979673] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-721fdf90-36c7-4f1c-8c91-41da9a84ee66 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.982915] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1846.982915] nova-compute[62208]: warnings.warn( [ 1847.009526] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-479d345f-b0ca-4ee4-844f-62001ddeaa2c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.011949] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1847.011949] nova-compute[62208]: warnings.warn( [ 1847.017163] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27ac1aa7-4366-4f13-be1d-bf674114ed14 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.020927] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1847.020927] nova-compute[62208]: warnings.warn( [ 1847.030800] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1847.041130] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1847.060745] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.275s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1847.061312] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Traceback (most recent call last): [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] self.driver.spawn(context, instance, image_meta, [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 
1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] self._fetch_image_if_missing(context, vi) [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] image_cache(vi, tmp_image_ds_loc) [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] vm_util.copy_virtual_disk( [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] session._wait_for_task(vmdk_copy_task) [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] return self.wait_for_task(task_ref) [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] return evt.wait() [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] result = hub.switch() [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] return self.greenlet.switch() [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] self.f(*self.args, **self.kw) [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1847.061312] nova-compute[62208]: ERROR 
nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] raise exceptions.translate_fault(task_info.error) [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Faults: ['InvalidArgument'] [ 1847.061312] nova-compute[62208]: ERROR nova.compute.manager [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] [ 1847.062540] nova-compute[62208]: DEBUG nova.compute.utils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1847.063491] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Build of instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 was re-scheduled: A specified parameter was not correct: fileType [ 1847.063491] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1847.063860] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1847.064057] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1847.064238] nova-compute[62208]: DEBUG nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1847.064416] nova-compute[62208]: DEBUG nova.network.neutron [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1847.141142] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1847.304047] nova-compute[62208]: DEBUG nova.network.neutron [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1847.316677] nova-compute[62208]: INFO nova.compute.manager [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Took 0.25 seconds to deallocate network for instance. 
[ 1847.417945] nova-compute[62208]: INFO nova.scheduler.client.report [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted allocations for instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 [ 1847.446331] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ca882aac-9d98-485b-bc8d-4d0b65396ae6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 632.298s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.447620] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 435.902s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1847.447840] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1847.448120] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1847.448315] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.450763] nova-compute[62208]: INFO nova.compute.manager [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Terminating instance [ 1847.452712] nova-compute[62208]: DEBUG nova.compute.manager [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1847.452897] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1847.453374] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-26ea734a-b780-47a2-86b7-3685d48ca0a8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.455973] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1847.455973] nova-compute[62208]: warnings.warn( [ 1847.463691] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f253d18-8c89-4ea0-9c1f-aaaae74ee05a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.475818] nova-compute[62208]: DEBUG nova.compute.manager [None req-a334d70d-723c-437b-8e0c-12a59c7d6579 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 638c73fd-6c45-4f50-8078-2a61f3339ad2] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1847.478091] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1847.478091] nova-compute[62208]: warnings.warn( [ 1847.496753] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bbb642f2-aa5c-4e71-b25b-e32acd45f879 could not be found. [ 1847.496955] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1847.497130] nova-compute[62208]: INFO nova.compute.manager [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1847.497384] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1847.497630] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1847.497723] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1847.504431] nova-compute[62208]: DEBUG nova.compute.manager [None req-a334d70d-723c-437b-8e0c-12a59c7d6579 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 638c73fd-6c45-4f50-8078-2a61f3339ad2] Instance disappeared before build. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2430}} [ 1847.537706] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a334d70d-723c-437b-8e0c-12a59c7d6579 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "638c73fd-6c45-4f50-8078-2a61f3339ad2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.439s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.540269] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1847.549162] nova-compute[62208]: INFO nova.compute.manager [-] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] Took 0.05 seconds to deallocate network for instance. [ 1847.549509] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1847.603927] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1847.604190] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1847.605635] nova-compute[62208]: INFO nova.compute.claims [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1847.687120] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-f572a83e-f23b-4048-8b28-60c604e2b03e tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.239s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.687962] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 48.282s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1847.688227] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: bbb642f2-aa5c-4e71-b25b-e32acd45f879] During sync_power_state the instance has a pending task (deleting). Skip. [ 1847.688406] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "bbb642f2-aa5c-4e71-b25b-e32acd45f879" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.818392] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0a3d2b3-8bb8-4a36-9d25-edb5a141a3ba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.821356] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1847.821356] nova-compute[62208]: warnings.warn( [ 1847.826610] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-823b753e-df38-4b19-8b44-a9b1d2aef35f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.829712] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1847.829712] nova-compute[62208]: warnings.warn( [ 1847.856648] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2019ded3-6d14-4057-bfb8-ff48ccc5d4cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.859200] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1847.859200] nova-compute[62208]: warnings.warn( [ 1847.864712] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7446b1e0-c140-42d2-8650-bf16f8be51e8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.868369] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1847.868369] nova-compute[62208]: warnings.warn( [ 1847.878171] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1847.886552] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1847.904181] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.904785] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1847.942667] nova-compute[62208]: DEBUG nova.compute.utils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1847.943932] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1847.944115] nova-compute[62208]: DEBUG nova.network.neutron [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1847.954643] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1847.990000] nova-compute[62208]: DEBUG nova.policy [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bd5497a94524d2d97f74b1fbaedd7f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9b50d4a3e0c43d491d13e85d9a2bb8a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1848.032262] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1848.055806] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1848.056097] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1848.056277] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1848.056461] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1848.056602] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1848.056812] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1848.057087] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1848.057262] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1848.057456] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1848.057644] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1848.057819] nova-compute[62208]: DEBUG nova.virt.hardware [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1848.058736] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd60a95c-4ec9-442d-8c77-5f83c4222cf4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.061334] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1848.061334] nova-compute[62208]: warnings.warn( [ 1848.069323] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44e7d4f3-72c4-400a-aed3-050405229955 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.073175] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1848.073175] nova-compute[62208]: warnings.warn( [ 1848.381763] nova-compute[62208]: DEBUG nova.network.neutron [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Successfully created port: 341691f6-6d0f-417c-805a-c5465b2c6f84 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1849.067829] nova-compute[62208]: DEBUG nova.network.neutron [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Successfully updated port: 341691f6-6d0f-417c-805a-c5465b2c6f84 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1849.078603] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "refresh_cache-e0a444fc-dca2-419a-9ac1-8d71048e1690" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1849.078756] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "refresh_cache-e0a444fc-dca2-419a-9ac1-8d71048e1690" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1849.078913] nova-compute[62208]: DEBUG nova.network.neutron [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1849.117784] nova-compute[62208]: DEBUG nova.network.neutron [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1849.275175] nova-compute[62208]: DEBUG nova.network.neutron [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Updating instance_info_cache with network_info: [{"id": "341691f6-6d0f-417c-805a-c5465b2c6f84", "address": "fa:16:3e:1a:6e:c7", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap341691f6-6d", "ovs_interfaceid": "341691f6-6d0f-417c-805a-c5465b2c6f84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1849.290099] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "refresh_cache-e0a444fc-dca2-419a-9ac1-8d71048e1690" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1849.290467] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Instance network_info: |[{"id": "341691f6-6d0f-417c-805a-c5465b2c6f84", "address": "fa:16:3e:1a:6e:c7", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap341691f6-6d", "ovs_interfaceid": "341691f6-6d0f-417c-805a-c5465b2c6f84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1849.290907] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1a:6e:c7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '459b8c74-0aa6-42b6-996a-42b1c5d7e5c6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '341691f6-6d0f-417c-805a-c5465b2c6f84', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1849.298335] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1849.298864] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1849.299141] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b8ebb932-1ceb-436d-b69c-957042412d8a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1849.313387] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1849.313387] nova-compute[62208]: warnings.warn( [ 1849.319913] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1849.319913] nova-compute[62208]: value = "task-38641" [ 1849.319913] nova-compute[62208]: _type = "Task" [ 1849.319913] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1849.323161] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1849.323161] nova-compute[62208]: warnings.warn( [ 1849.328857] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38641, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1849.338238] nova-compute[62208]: DEBUG nova.compute.manager [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Received event network-vif-plugged-341691f6-6d0f-417c-805a-c5465b2c6f84 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1849.338508] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] Acquiring lock "e0a444fc-dca2-419a-9ac1-8d71048e1690-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1849.338661] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] Lock "e0a444fc-dca2-419a-9ac1-8d71048e1690-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1849.338815] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] Lock "e0a444fc-dca2-419a-9ac1-8d71048e1690-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1849.338974] nova-compute[62208]: DEBUG nova.compute.manager [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] No waiting events found dispatching network-vif-plugged-341691f6-6d0f-417c-805a-c5465b2c6f84 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1849.339169] nova-compute[62208]: WARNING nova.compute.manager [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Received unexpected event network-vif-plugged-341691f6-6d0f-417c-805a-c5465b2c6f84 for instance with vm_state building and task_state spawning. [ 1849.339331] nova-compute[62208]: DEBUG nova.compute.manager [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Received event network-changed-341691f6-6d0f-417c-805a-c5465b2c6f84 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1849.339484] nova-compute[62208]: DEBUG nova.compute.manager [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Refreshing instance network info cache due to event network-changed-341691f6-6d0f-417c-805a-c5465b2c6f84. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1849.339665] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] Acquiring lock "refresh_cache-e0a444fc-dca2-419a-9ac1-8d71048e1690" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1849.339801] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] Acquired lock "refresh_cache-e0a444fc-dca2-419a-9ac1-8d71048e1690" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1849.339956] nova-compute[62208]: DEBUG nova.network.neutron [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Refreshing network info cache for port 341691f6-6d0f-417c-805a-c5465b2c6f84 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1849.586283] nova-compute[62208]: DEBUG nova.network.neutron [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Updated VIF entry in instance network info cache for port 341691f6-6d0f-417c-805a-c5465b2c6f84. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1849.586761] nova-compute[62208]: DEBUG nova.network.neutron [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Updating instance_info_cache with network_info: [{"id": "341691f6-6d0f-417c-805a-c5465b2c6f84", "address": "fa:16:3e:1a:6e:c7", "network": {"id": "36e2d180-28fc-4e4c-90c8-e71996738298", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1469032612-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a9b50d4a3e0c43d491d13e85d9a2bb8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap341691f6-6d", "ovs_interfaceid": "341691f6-6d0f-417c-805a-c5465b2c6f84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1849.597225] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2601f24f-f698-41b7-af88-39f9b230c365 req-913fe649-26e0-45c5-a135-58b357a79d62 service nova] Releasing lock "refresh_cache-e0a444fc-dca2-419a-9ac1-8d71048e1690" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1849.824607] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1849.824607] nova-compute[62208]: warnings.warn( [ 1849.830603] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38641, 'name': CreateVM_Task, 'duration_secs': 0.313956} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1849.830773] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1849.838370] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1849.838629] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1849.841661] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa72b8ea-cfbb-409c-873a-e1a025a6b5a0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1849.852481] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1849.852481] nova-compute[62208]: warnings.warn( [ 1849.877039] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Reconfiguring VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1849.877556] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-54994f85-35fb-4f19-af6d-104f372e30f9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1849.887874] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1849.887874] nova-compute[62208]: warnings.warn( [ 1849.893355] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1849.893355] nova-compute[62208]: value = "task-38642" [ 1849.893355] nova-compute[62208]: _type = "Task" [ 1849.893355] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1849.896364] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1849.896364] nova-compute[62208]: warnings.warn( [ 1849.904652] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38642, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1850.398247] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1850.398247] nova-compute[62208]: warnings.warn( [ 1850.404055] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38642, 'name': ReconfigVM_Task, 'duration_secs': 0.111333} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1850.404361] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Reconfigured VM instance to enable vnc on port - 5908 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1850.404581] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.566s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1850.404829] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1850.404975] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1850.405318] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1850.405581] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ec346324-7f93-4557-8ad4-bd06f7187e18 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.407201] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1850.407201] nova-compute[62208]: warnings.warn( [ 1850.410413] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1850.410413] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]523edbf2-70c5-2ffb-3bbb-e4262a99b078" [ 1850.410413] nova-compute[62208]: _type = "Task" [ 1850.410413] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1850.413470] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1850.413470] nova-compute[62208]: warnings.warn( [ 1850.418152] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]523edbf2-70c5-2ffb-3bbb-e4262a99b078, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1850.914452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1850.914452] nova-compute[62208]: warnings.warn( [ 1850.921181] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1850.921549] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1850.921810] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1851.140707] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1851.151453] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1851.151815] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s 
{{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1851.152059] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1851.152303] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1851.153513] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-090658fb-5909-4b6f-bf9c-6d29dac5947b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.156380] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1851.156380] nova-compute[62208]: warnings.warn( [ 1851.163333] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdfd2ce6-ea06-45d8-b476-f85547e36f6e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.167478] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1851.167478] nova-compute[62208]: warnings.warn( [ 1851.179810] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e377124d-ec5b-44db-8ef4-f745220f6fc9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.182517] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1851.182517] nova-compute[62208]: warnings.warn( [ 1851.187295] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a23e87-b13e-4942-b793-a969d40f8a2a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.190518] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1851.190518] nova-compute[62208]: warnings.warn( [ 1851.216701] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181961MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1851.216863] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1851.217067] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1851.289798] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.290045] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.290201] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.290374] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.290447] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.290572] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.290692] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.290812] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.290931] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.291051] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1851.303312] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1851.303556] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1851.303703] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1851.470612] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5732e265-85b4-430b-99a2-20839032fa17 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.473316] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1851.473316] nova-compute[62208]: warnings.warn( [ 1851.478716] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e414047-8f8f-425c-83c6-ed9f09ed5d93 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.481737] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1851.481737] nova-compute[62208]: warnings.warn( [ 1851.509524] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b39ec27-2654-4559-b5a7-0e3bb1a259b3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.512157] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1851.512157] nova-compute[62208]: warnings.warn( [ 1851.517681] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aaee23e-c07e-4777-a2e2-169f275182ab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.521415] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1851.521415] nova-compute[62208]: warnings.warn( [ 1851.530884] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1851.541586] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1851.557690] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1851.558066] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1852.553780] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.554066] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.554298] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1852.554486] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1852.576600] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.576695] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.576833] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.576966] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.577093] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.577216] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.577336] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.577453] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.577572] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.577689] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1852.577807] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1852.578344] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1853.141004] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1853.908938] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "5b797610-f460-461c-8c5a-1a28cf162c0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1853.909287] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "5b797610-f460-461c-8c5a-1a28cf162c0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1854.140649] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1856.141403] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1856.141832] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1860.137072] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1894.924291] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1894.924291] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1894.924959] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1894.926499] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1894.926725] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Copying Virtual Disk [datastore2] vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/17a3a3c3-1d42-490a-b07f-21f9279a51cd/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1894.927016] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking 
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-21712e3c-0148-43e4-903c-76e04a5fd571 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1894.929644] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1894.929644] nova-compute[62208]: warnings.warn( [ 1894.936542] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 1894.936542] nova-compute[62208]: value = "task-38643" [ 1894.936542] nova-compute[62208]: _type = "Task" [ 1894.936542] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1894.939681] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1894.939681] nova-compute[62208]: warnings.warn( [ 1894.944865] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38643, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.441211] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.441211] nova-compute[62208]: warnings.warn( [ 1895.446551] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1895.446824] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1895.447370] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Traceback (most recent call last): [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] yield resources [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] self.driver.spawn(context, instance, image_meta, [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] self._fetch_image_if_missing(context, vi) [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] image_cache(vi, tmp_image_ds_loc) [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] vm_util.copy_virtual_disk( [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] session._wait_for_task(vmdk_copy_task) [ 1895.447370] 
nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] return self.wait_for_task(task_ref) [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] return evt.wait() [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] result = hub.switch() [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] return self.greenlet.switch() [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] self.f(*self.args, **self.kw) [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] raise exceptions.translate_fault(task_info.error) [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Faults: ['InvalidArgument'] [ 1895.447370] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] [ 1895.448205] nova-compute[62208]: INFO nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Terminating instance [ 1895.449283] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1895.449492] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Creating directory with path [datastore2] 
devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1895.449730] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2bab302f-5471-489b-bdb2-0b4d63c1c7fc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.452147] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1895.452333] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1895.453060] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-225552eb-b003-4f27-a814-1094f5053dc3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.455326] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.455326] nova-compute[62208]: warnings.warn( [ 1895.455681] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.455681] nova-compute[62208]: warnings.warn( [ 1895.459907] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1895.460140] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-11d53785-88c0-4400-a62d-91aa67c382a1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.462844] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1895.463009] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1895.463732] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.463732] nova-compute[62208]: warnings.warn( [ 1895.464157] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d5eb3b6b-e0d5-4ae0-83ad-68db1c338b37 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.466304] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.466304] nova-compute[62208]: warnings.warn( [ 1895.469665] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Waiting for the task: (returnval){ [ 1895.469665] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52311d97-02a3-52cf-17d6-b5e125509275" [ 1895.469665] nova-compute[62208]: _type = "Task" [ 1895.469665] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1895.472529] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.472529] nova-compute[62208]: warnings.warn( [ 1895.483461] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52311d97-02a3-52cf-17d6-b5e125509275, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.588522] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1895.588912] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1895.589242] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Deleting the datastore file [datastore2] ec31fb88-38c6-400d-b1ec-c93af711a1f6 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1895.589637] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9127b88b-7aba-4968-a272-3ea6a170f5d6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.591989] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.591989] nova-compute[62208]: warnings.warn( [ 1895.596987] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 1895.596987] nova-compute[62208]: value = "task-38645" [ 1895.596987] nova-compute[62208]: _type = "Task" [ 1895.596987] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1895.600410] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.600410] nova-compute[62208]: warnings.warn( [ 1895.605346] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38645, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.974271] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.974271] nova-compute[62208]: warnings.warn( [ 1895.980530] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1895.980804] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Creating directory with path [datastore2] vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1895.981047] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4e0ed02c-ab75-4d04-9dc3-9a4fe1b37110 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.982706] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.982706] nova-compute[62208]: warnings.warn( [ 1895.992395] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Created directory with path [datastore2] vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1895.992623] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Fetch image to [datastore2] vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1895.992802] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1895.993509] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d2e786e-0497-4129-8c64-d54c1d858886 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.995730] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1895.995730] nova-compute[62208]: warnings.warn( [ 1896.000140] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae343345-da65-4d96-af1b-10948f14b256 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.002219] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.002219] nova-compute[62208]: warnings.warn( [ 1896.009442] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a71941a4-5965-42f3-9c64-8747c41c0661 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.012870] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.012870] nova-compute[62208]: warnings.warn( [ 1896.041717] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cbe0ffc-25f8-4af2-978e-be0b3cd035fd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.044301] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.044301] nova-compute[62208]: warnings.warn( [ 1896.048455] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b2606cac-6631-4b22-9b22-1f14429d646b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.050139] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.050139] nova-compute[62208]: warnings.warn( [ 1896.069028] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1896.101736] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.101736] nova-compute[62208]: warnings.warn( [ 1896.108353] nova-compute[62208]: DEBUG oslo_vmware.api [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38645, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077026} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1896.108607] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1896.108795] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1896.108967] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1896.109153] nova-compute[62208]: INFO nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Took 0.66 seconds to destroy the instance on the hypervisor. 
[ 1896.111204] nova-compute[62208]: DEBUG nova.compute.claims [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Aborting claim: <nova.compute.claims.Claim object at 0x7fb937461ea0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1896.111383] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1896.111598] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1896.124428] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1896.181968] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1896.182025] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1896.323713] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-433b0885-dfd1-4e7d-b95f-00cdfd00d8ef {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.326715] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.326715] nova-compute[62208]: warnings.warn( [ 1896.332246] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14bb5511-ff0b-4748-b2d6-5383297b0126 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.335244] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.335244] nova-compute[62208]: warnings.warn( [ 1896.361720] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b3c4791-8507-43d2-b06f-f3c2744e2aed {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.363909] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.363909] nova-compute[62208]: warnings.warn( [ 1896.368877] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00e323ca-6252-43ec-87f5-729b210569ad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.373439] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.373439] nova-compute[62208]: warnings.warn( [ 1896.382854] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1896.391700] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1896.408024] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.295s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1896.408024] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Traceback (most recent call last): [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] self.driver.spawn(context, instance, image_meta, [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] self._fetch_image_if_missing(context, vi) [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1896.408024] nova-compute[62208]: 
ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] image_cache(vi, tmp_image_ds_loc) [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] vm_util.copy_virtual_disk( [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] session._wait_for_task(vmdk_copy_task) [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] return self.wait_for_task(task_ref) [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] return evt.wait() [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] result = hub.switch() [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] return self.greenlet.switch() [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] self.f(*self.args, **self.kw) [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] raise exceptions.translate_fault(task_info.error) [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Faults: ['InvalidArgument'] [ 1896.408024] nova-compute[62208]: ERROR nova.compute.manager [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] [ 1896.408835] nova-compute[62208]: DEBUG nova.compute.utils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 
tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1896.409998] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Build of instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 was re-scheduled: A specified parameter was not correct: fileType [ 1896.409998] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1896.410374] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1896.410549] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1896.410717] nova-compute[62208]: DEBUG nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1896.410882] nova-compute[62208]: DEBUG nova.network.neutron [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1896.797883] nova-compute[62208]: DEBUG nova.network.neutron [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1896.812894] nova-compute[62208]: INFO nova.compute.manager [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Took 0.40 seconds to deallocate network for instance. 
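The build failure traced above ends inside oslo_vmware's task-polling loop, which raises exceptions.translate_fault(task_info.error) once the CopyVirtualDisk_Task finishes in the error state. The following is a simplified, self-contained sketch (names and structure are assumptions for illustration, not the oslo_vmware source) of how a task that ends in 'error' surfaces as a VimFaultException carrying the fault list seen here ("Faults: ['InvalidArgument']"):

    class VimFaultException(Exception):
        # carries the vSphere fault names alongside the human-readable message,
        # mirroring the "Faults: ['InvalidArgument']" line in the traceback above
        def __init__(self, fault_list, msg):
            super().__init__(msg)
            self.fault_list = fault_list

    def poll_task(get_task_info):
        # one polling step: return the result, translate an error, or keep waiting
        info = get_task_info()
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            raise VimFaultException(info["faults"], info["message"])
        return None  # still queued/running; the caller polls again

    # example mirroring the CopyVirtualDisk_Task failure logged above
    try:
        poll_task(lambda: {"state": "error",
                           "faults": ["InvalidArgument"],
                           "message": "A specified parameter was not correct: fileType"})
    except VimFaultException as exc:
        print(exc, exc.fault_list)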
[ 1896.908026] nova-compute[62208]: INFO nova.scheduler.client.report [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Deleted allocations for instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 [ 1896.926883] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-811958e5-978f-4d29-b780-3245490a954c tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 678.633s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.928082] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 482.863s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1896.928347] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1896.928561] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1896.929342] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.931355] nova-compute[62208]: INFO nova.compute.manager [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Terminating instance [ 1896.933944] nova-compute[62208]: DEBUG nova.compute.manager [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1896.934422] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1896.934704] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2302b6b3-42c8-4d39-a4f0-c6ac1cafcafc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.937194] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.937194] nova-compute[62208]: warnings.warn( [ 1896.944845] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ded03fbf-f86e-4b5f-8385-098a2c698c24 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.955903] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1896.958771] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1896.958771] nova-compute[62208]: warnings.warn( [ 1896.978553] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ec31fb88-38c6-400d-b1ec-c93af711a1f6 could not be found. [ 1896.978878] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1896.978926] nova-compute[62208]: INFO nova.compute.manager [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1896.979246] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1896.979481] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1896.979612] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1897.011487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1897.011762] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1897.013335] nova-compute[62208]: INFO nova.compute.claims [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1897.024989] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1897.047076] nova-compute[62208]: INFO nova.compute.manager [-] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] Took 0.07 seconds to deallocate network for instance. [ 1897.143496] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10dde76c-fe12-4a4d-af27-f16ee2d244e9 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.215s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1897.144493] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 97.738s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1897.144682] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec31fb88-38c6-400d-b1ec-c93af711a1f6] During sync_power_state the instance has a pending task (deleting). Skip. 
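The "Acquiring lock … / acquired … waited 482.863s / released … held …" lines above come from oslo_concurrency's named-lock serialization: terminate_instance for ec31fb88-… could not start until the long-running build/claim path released the same per-instance lock. A minimal sketch of that pattern, assuming oslo.concurrency is installed (the function body is a placeholder, not Nova's implementation):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('ec31fb88-38c6-400d-b1ec-c93af711a1f6')
    def do_terminate_instance():
        # runs only once any other holder of the same named lock has released it,
        # which is why the log records a 482.863s wait before termination began
        pass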
[ 1897.144903] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ec31fb88-38c6-400d-b1ec-c93af711a1f6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1897.209110] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38123f89-aefa-4755-b811-9cbed95b83bb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.211919] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1897.211919] nova-compute[62208]: warnings.warn( [ 1897.217664] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d3cfc47-0822-443a-8a7a-1e2ede4a353f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.220643] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1897.220643] nova-compute[62208]: warnings.warn( [ 1897.247518] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f695f695-4a47-4cd7-998d-bb99731822e3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.250123] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1897.250123] nova-compute[62208]: warnings.warn( [ 1897.255424] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76667d47-19a6-4124-8cbe-9a7d28a1ea85 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.259095] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1897.259095] nova-compute[62208]: warnings.warn( [ 1897.268607] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1897.277158] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1897.293460] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1897.293936] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1897.327450] nova-compute[62208]: DEBUG nova.compute.utils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1897.328963] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1897.329154] nova-compute[62208]: DEBUG nova.network.neutron [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1897.339078] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1897.374846] nova-compute[62208]: DEBUG nova.policy [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c80e7a39b71446eb86cc235973d9eb55', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a607d976ab14539a9d204c9437c3522', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1897.406632] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1897.428125] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1897.428393] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1897.428554] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1897.428732] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1897.428877] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1897.429025] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae 
tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1897.429238] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1897.429386] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1897.429551] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1897.429709] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1897.429876] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1897.430986] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee5a601c-21ba-468e-a45a-bd093cc8933d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.433427] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1897.433427] nova-compute[62208]: warnings.warn( [ 1897.438972] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b65f16df-ae84-4d85-b2c3-46dca8f5669d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.443000] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1897.443000] nova-compute[62208]: warnings.warn( [ 1897.633136] nova-compute[62208]: DEBUG nova.network.neutron [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Successfully created port: 208bbce8-6035-4f3a-8c36-828fb2e53fbf {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1898.303529] nova-compute[62208]: DEBUG nova.network.neutron [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Successfully updated port: 208bbce8-6035-4f3a-8c36-828fb2e53fbf {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1898.318773] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "refresh_cache-911fbbcc-69d5-479f-87f1-2561fcb3dd6b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1898.318924] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquired lock "refresh_cache-911fbbcc-69d5-479f-87f1-2561fcb3dd6b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1898.319079] nova-compute[62208]: DEBUG nova.network.neutron [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1898.358185] nova-compute[62208]: DEBUG nova.network.neutron [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1898.507850] nova-compute[62208]: DEBUG nova.network.neutron [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Updating instance_info_cache with network_info: [{"id": "208bbce8-6035-4f3a-8c36-828fb2e53fbf", "address": "fa:16:3e:89:49:72", "network": {"id": "4ff34335-1f93-40d4-a4ee-e581b57a773a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1917389442-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7a607d976ab14539a9d204c9437c3522", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f5fe645c-e088-401e-ab53-4ae2981dea72", "external-id": "nsx-vlan-transportzone-219", "segmentation_id": 219, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap208bbce8-60", "ovs_interfaceid": "208bbce8-6035-4f3a-8c36-828fb2e53fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1898.521912] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Releasing lock "refresh_cache-911fbbcc-69d5-479f-87f1-2561fcb3dd6b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1898.522226] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Instance network_info: |[{"id": "208bbce8-6035-4f3a-8c36-828fb2e53fbf", "address": "fa:16:3e:89:49:72", "network": {"id": "4ff34335-1f93-40d4-a4ee-e581b57a773a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1917389442-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7a607d976ab14539a9d204c9437c3522", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f5fe645c-e088-401e-ab53-4ae2981dea72", "external-id": "nsx-vlan-transportzone-219", "segmentation_id": 219, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap208bbce8-60", "ovs_interfaceid": "208bbce8-6035-4f3a-8c36-828fb2e53fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1898.522656] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:89:49:72', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f5fe645c-e088-401e-ab53-4ae2981dea72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '208bbce8-6035-4f3a-8c36-828fb2e53fbf', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1898.530050] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1898.530579] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1898.530814] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8a57ce9a-a0de-42f0-a2e9-aa5ec2e0efc3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1898.545079] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1898.545079] nova-compute[62208]: warnings.warn( [ 1898.551720] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1898.551720] nova-compute[62208]: value = "task-38646" [ 1898.551720] nova-compute[62208]: _type = "Task" [ 1898.551720] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1898.554892] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1898.554892] nova-compute[62208]: warnings.warn( [ 1898.560270] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38646, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1898.874599] nova-compute[62208]: DEBUG nova.compute.manager [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Received event network-vif-plugged-208bbce8-6035-4f3a-8c36-828fb2e53fbf {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1898.874858] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] Acquiring lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1898.875067] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] Lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1898.875240] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] Lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1898.875413] nova-compute[62208]: DEBUG nova.compute.manager [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] No waiting events found dispatching network-vif-plugged-208bbce8-6035-4f3a-8c36-828fb2e53fbf {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1898.875587] nova-compute[62208]: WARNING nova.compute.manager [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Received unexpected event network-vif-plugged-208bbce8-6035-4f3a-8c36-828fb2e53fbf for instance with vm_state building and task_state spawning. [ 1898.875749] nova-compute[62208]: DEBUG nova.compute.manager [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Received event network-changed-208bbce8-6035-4f3a-8c36-828fb2e53fbf {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1898.875904] nova-compute[62208]: DEBUG nova.compute.manager [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Refreshing instance network info cache due to event network-changed-208bbce8-6035-4f3a-8c36-828fb2e53fbf. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1898.876174] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] Acquiring lock "refresh_cache-911fbbcc-69d5-479f-87f1-2561fcb3dd6b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1898.876338] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] Acquired lock "refresh_cache-911fbbcc-69d5-479f-87f1-2561fcb3dd6b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1898.876501] nova-compute[62208]: DEBUG nova.network.neutron [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Refreshing network info cache for port 208bbce8-6035-4f3a-8c36-828fb2e53fbf {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1899.056441] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1899.056441] nova-compute[62208]: warnings.warn( [ 1899.063135] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38646, 'name': CreateVM_Task, 'duration_secs': 0.301209} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1899.063365] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1899.063989] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1899.064309] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1899.067231] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6c3353e-ee05-4327-97bc-989f333ca26f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1899.089270] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1899.089270] nova-compute[62208]: warnings.warn( [ 1899.121818] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1899.122384] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-e8a8f609-27ce-4e08-a02d-4ecb3315d50a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1899.143828] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1899.143828] nova-compute[62208]: warnings.warn( [ 1899.151634] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 1899.151634] nova-compute[62208]: value = "task-38647" [ 1899.151634] nova-compute[62208]: _type = "Task" [ 1899.151634] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1899.156092] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1899.156092] nova-compute[62208]: warnings.warn( [ 1899.165486] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38647, 'name': ReconfigVM_Task} progress is 10%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1899.186397] nova-compute[62208]: DEBUG nova.network.neutron [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Updated VIF entry in instance network info cache for port 208bbce8-6035-4f3a-8c36-828fb2e53fbf. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1899.186949] nova-compute[62208]: DEBUG nova.network.neutron [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Updating instance_info_cache with network_info: [{"id": "208bbce8-6035-4f3a-8c36-828fb2e53fbf", "address": "fa:16:3e:89:49:72", "network": {"id": "4ff34335-1f93-40d4-a4ee-e581b57a773a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1917389442-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "7a607d976ab14539a9d204c9437c3522", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f5fe645c-e088-401e-ab53-4ae2981dea72", "external-id": "nsx-vlan-transportzone-219", "segmentation_id": 219, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap208bbce8-60", "ovs_interfaceid": "208bbce8-6035-4f3a-8c36-828fb2e53fbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1899.199101] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2f145273-f775-417d-ad8d-da3b26c81a5e req-2772c3e0-99a9-4634-9573-304e1890cbc9 service nova] Releasing lock "refresh_cache-911fbbcc-69d5-479f-87f1-2561fcb3dd6b" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1899.656611] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1899.656611] nova-compute[62208]: warnings.warn( [ 1899.663038] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38647, 'name': ReconfigVM_Task, 'duration_secs': 0.110583} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1899.663325] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1899.663544] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.599s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1899.663800] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1899.663943] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1899.664371] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1899.664591] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-597cdb26-cd56-4570-9ca4-558a741478a6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1899.666241] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1899.666241] nova-compute[62208]: warnings.warn( [ 1899.669814] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 1899.669814] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52dc6f48-ca27-6d00-386a-12a171cccc89" [ 1899.669814] nova-compute[62208]: _type = "Task" [ 1899.669814] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1899.673352] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1899.673352] nova-compute[62208]: warnings.warn( [ 1899.680085] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52dc6f48-ca27-6d00-386a-12a171cccc89, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1900.174250] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1900.174250] nova-compute[62208]: warnings.warn( [ 1900.183928] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1900.184473] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1900.184729] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1907.140936] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1908.140566] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1911.141254] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1911.152671] nova-compute[62208]: DEBUG oslo_concurrency.lockutils 
[None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1911.152671] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1911.152827] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1911.152956] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1911.154016] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-581ce604-0539-4992-81c4-44c4807935a2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.157096] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1911.157096] nova-compute[62208]: warnings.warn( [ 1911.163445] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70fd9825-fcb2-491d-9951-f2c126a4144d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.167313] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1911.167313] nova-compute[62208]: warnings.warn( [ 1911.177831] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcda839d-8c7c-4f02-9f57-e32cf29ef498 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.180197] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1911.180197] nova-compute[62208]: warnings.warn( [ 1911.184923] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04deb204-773f-4af1-9191-ef595d9544ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.187977] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1911.187977] nova-compute[62208]: warnings.warn( [ 1911.214489] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181949MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1911.214640] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1911.214846] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1911.281347] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.281502] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.281627] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.281745] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.281860] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.281972] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.282082] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.282192] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.282298] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.282404] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1911.293204] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b797610-f460-461c-8c5a-1a28cf162c0e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1911.293456] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1911.293590] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1911.439328] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16bb02e3-0297-4d86-b6af-7003573c2aa1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.441853] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1911.441853] nova-compute[62208]: warnings.warn( [ 1911.446910] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-069db308-e3c4-4403-a90b-140c0c66c48b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.450166] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1911.450166] nova-compute[62208]: warnings.warn( [ 1911.476660] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-722e2413-ba65-4a57-a681-33fe1378a68f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.479092] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1911.479092] nova-compute[62208]: warnings.warn( [ 1911.484546] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3787f942-9a40-49e3-9e1c-231bb19cfb94 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.488190] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1911.488190] nova-compute[62208]: warnings.warn( [ 1911.497581] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1911.506913] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1911.524282] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1911.524543] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.310s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1913.524613] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1913.524911] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1913.525061] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1913.544808] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.544964] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.545089] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.545217] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.545342] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.545463] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.545623] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.545737] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.545812] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.545925] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1913.546062] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1913.546555] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.141163] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.141404] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.141568] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1918.140694] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1918.141080] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1944.942421] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1944.942421] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1944.943165] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1944.944707] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1944.944943] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Copying Virtual Disk [datastore2] vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/449aee94-4295-4379-a5dd-0d54fff20c54/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1944.945238] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-76e80ee6-1bce-4b40-ab4f-d8cd993cfd1c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.947581] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1944.947581] nova-compute[62208]: warnings.warn( [ 1944.952904] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Waiting for the task: (returnval){ [ 1944.952904] nova-compute[62208]: value = "task-38648" [ 1944.952904] nova-compute[62208]: _type = "Task" [ 1944.952904] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1944.956335] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1944.956335] nova-compute[62208]: warnings.warn( [ 1944.961542] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Task: {'id': task-38648, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1945.457736] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.457736] nova-compute[62208]: warnings.warn( [ 1945.464378] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1945.464684] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1945.465270] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Traceback (most recent call last): [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] yield resources [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] self.driver.spawn(context, instance, image_meta, [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] self._fetch_image_if_missing(context, vi) [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] image_cache(vi, tmp_image_ds_loc) [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] vm_util.copy_virtual_disk( [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] 
session._wait_for_task(vmdk_copy_task) [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] return self.wait_for_task(task_ref) [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] return evt.wait() [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] result = hub.switch() [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] return self.greenlet.switch() [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] self.f(*self.args, **self.kw) [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] raise exceptions.translate_fault(task_info.error) [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Faults: ['InvalidArgument'] [ 1945.465270] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] [ 1945.466263] nova-compute[62208]: INFO nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Terminating instance [ 1945.468491] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1945.468681] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1945.468982] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1945.469178] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1945.469929] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fb6f880-6bfc-4220-aacb-d847ac61534a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.472614] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e7af5bd3-de81-483f-a66b-217b47c37894 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.474422] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.474422] nova-compute[62208]: warnings.warn( [ 1945.474795] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.474795] nova-compute[62208]: warnings.warn( [ 1945.479520] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1945.479896] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d4da8bb5-e982-4116-b6d6-4fb16d6598ba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.482678] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1945.482864] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1945.483468] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.483468] nova-compute[62208]: warnings.warn( [ 1945.483942] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e1298a85-0208-43ff-8ff6-024cc5173eec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.486127] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.486127] nova-compute[62208]: warnings.warn( [ 1945.489287] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1945.489287] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5214198d-fd17-86af-abcb-44be7684d20d" [ 1945.489287] nova-compute[62208]: _type = "Task" [ 1945.489287] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1945.492301] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.492301] nova-compute[62208]: warnings.warn( [ 1945.497488] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5214198d-fd17-86af-abcb-44be7684d20d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1945.550087] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1945.550315] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1945.550538] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Deleting the datastore file [datastore2] 47cd2de6-8094-452e-afd7-aa42128a1b0c {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1945.550820] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-32ba59aa-4fa8-4fb1-8742-efbc4deacead {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.552930] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.552930] nova-compute[62208]: warnings.warn( [ 1945.558224] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Waiting for the task: (returnval){ [ 1945.558224] nova-compute[62208]: value = "task-38650" [ 1945.558224] nova-compute[62208]: _type = "Task" [ 1945.558224] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1945.562714] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.562714] nova-compute[62208]: warnings.warn( [ 1945.568331] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Task: {'id': task-38650, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1945.993405] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1945.993405] nova-compute[62208]: warnings.warn( [ 1945.999522] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1945.999906] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating directory with path [datastore2] vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1946.000230] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-44805826-1643-473b-a75d-ca34aef676c2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.002193] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.002193] nova-compute[62208]: warnings.warn( [ 1946.013342] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Created directory with path [datastore2] vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1946.013671] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Fetch image to [datastore2] vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1946.013913] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1946.014767] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0f1e0e5-a1f7-4cf5-a9d4-0339b21e07a6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.017305] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.017305] nova-compute[62208]: warnings.warn( [ 1946.022267] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8567b708-00b6-4293-b8d5-f32f763bb038 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.024626] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.024626] nova-compute[62208]: warnings.warn( [ 1946.032451] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6570b0f8-6e61-48f5-a439-3be16fa6098b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.036341] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.036341] nova-compute[62208]: warnings.warn( [ 1946.067523] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c88975b8-170f-4614-9c2f-6c8889a54aee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.069964] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.069964] nova-compute[62208]: warnings.warn( [ 1946.070483] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.070483] nova-compute[62208]: warnings.warn( [ 1946.077447] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-676b3b01-3b42-49da-962f-2eb6b4351b84 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.079573] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Task: {'id': task-38650, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081846} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1946.079898] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1946.080172] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1946.080477] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1946.080788] nova-compute[62208]: INFO nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1946.082859] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.082859] nova-compute[62208]: warnings.warn( [ 1946.083754] nova-compute[62208]: DEBUG nova.compute.claims [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9371831f0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1946.084022] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1946.084330] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.103146] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1946.157439] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1946.218770] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1946.219029] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1946.338803] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a22e23f7-b256-47c4-b572-5479856e92a7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.341376] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.341376] nova-compute[62208]: warnings.warn( [ 1946.347134] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f219985-90e8-428f-94aa-ea8829fd972b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.350111] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.350111] nova-compute[62208]: warnings.warn( [ 1946.377580] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92ef5948-654e-478f-93bd-3f3cb09625d7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.380124] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.380124] nova-compute[62208]: warnings.warn( [ 1946.386215] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e92aec65-3b11-45c2-b568-471492942144 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.390359] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.390359] nova-compute[62208]: warnings.warn( [ 1946.400835] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1946.410287] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1946.428938] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.344s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1946.429500] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Traceback (most recent call last): [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] self.driver.spawn(context, instance, image_meta, [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] self._fetch_image_if_missing(context, vi) [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] image_cache(vi, tmp_image_ds_loc) [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] vm_util.copy_virtual_disk( [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] session._wait_for_task(vmdk_copy_task) [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] return self.wait_for_task(task_ref) [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] return evt.wait() [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] result = hub.switch() [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] return self.greenlet.switch() [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] self.f(*self.args, **self.kw) [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] raise exceptions.translate_fault(task_info.error) [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Faults: ['InvalidArgument'] [ 1946.429500] nova-compute[62208]: ERROR nova.compute.manager [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] [ 1946.430413] nova-compute[62208]: DEBUG nova.compute.utils 
[None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1946.432116] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Build of instance 47cd2de6-8094-452e-afd7-aa42128a1b0c was re-scheduled: A specified parameter was not correct: fileType [ 1946.432116] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1946.432508] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1946.432812] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1946.433034] nova-compute[62208]: DEBUG nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1946.433245] nova-compute[62208]: DEBUG nova.network.neutron [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1946.817515] nova-compute[62208]: DEBUG nova.network.neutron [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1946.834136] nova-compute[62208]: INFO nova.compute.manager [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Took 0.40 seconds to deallocate network for instance. 
[ 1946.933055] nova-compute[62208]: INFO nova.scheduler.client.report [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Deleted allocations for instance 47cd2de6-8094-452e-afd7-aa42128a1b0c [ 1946.952092] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b31b9af3-67b4-4d42-82c2-188cb56771db tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 629.333s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.954058] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 433.400s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.954058] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Acquiring lock "47cd2de6-8094-452e-afd7-aa42128a1b0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1946.954058] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.954286] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.956404] nova-compute[62208]: INFO nova.compute.manager [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Terminating instance [ 1946.958375] nova-compute[62208]: DEBUG nova.compute.manager [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1946.958872] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1946.959339] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bc0e12b9-6320-4314-813b-140a96a8b5e1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.962521] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.962521] nova-compute[62208]: warnings.warn( [ 1946.970968] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a922886-3874-416e-82da-e3ba14091780 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.983148] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1946.985807] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1946.985807] nova-compute[62208]: warnings.warn( [ 1947.007391] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 47cd2de6-8094-452e-afd7-aa42128a1b0c could not be found. [ 1947.007713] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1947.007756] nova-compute[62208]: INFO nova.compute.manager [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1947.008043] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1947.008329] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1947.008425] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1947.040931] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1947.043011] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1947.043300] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1947.045189] nova-compute[62208]: INFO nova.compute.claims [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1947.056340] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] Took 0.05 seconds to deallocate network for instance. 
[ 1947.156622] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-1668e682-8750-440a-bc2b-0fc26fcec522 tempest-ServersNegativeTestJSON-1891512129 tempest-ServersNegativeTestJSON-1891512129-project-member] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.203s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1947.157493] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 147.751s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1947.157709] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 47cd2de6-8094-452e-afd7-aa42128a1b0c] During sync_power_state the instance has a pending task (deleting). Skip. [ 1947.157853] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "47cd2de6-8094-452e-afd7-aa42128a1b0c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1947.253713] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5815eca-2e54-4546-9d8c-faeeca76504d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.256293] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1947.256293] nova-compute[62208]: warnings.warn( [ 1947.261589] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4abad4f9-7437-429c-9518-b67ea577a917 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.264686] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1947.264686] nova-compute[62208]: warnings.warn( [ 1947.296840] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dfbf85f-66f8-407f-a366-c031f3383f83 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.302837] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1947.302837] nova-compute[62208]: warnings.warn( [ 1947.308590] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a18f81c3-dd81-4e46-b0e4-42f8a87b5de3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.312429] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1947.312429] nova-compute[62208]: warnings.warn( [ 1947.322446] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1947.331088] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1947.350111] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1947.350632] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1947.386911] nova-compute[62208]: DEBUG nova.compute.utils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1947.389043] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1947.389726] nova-compute[62208]: DEBUG nova.network.neutron [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1947.401450] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1947.445176] nova-compute[62208]: DEBUG nova.policy [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cb3f0377ac64412bf238ba3e97ecd9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4fb2ff705fe34117b2dfb9354ae8cfc8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1947.477889] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1947.500397] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1947.500627] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1947.500781] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1947.500957] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1947.501100] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1947.501245] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1947.501443] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1947.501599] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1947.501762] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1947.501920] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1947.502084] nova-compute[62208]: DEBUG nova.virt.hardware [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1947.503147] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35c0caab-c913-4274-bd24-89405fdc8ed8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.505823] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1947.505823] nova-compute[62208]: warnings.warn( [ 1947.511680] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1988cf2a-af87-4d16-bd86-15f953ac73bd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.515836] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1947.515836] nova-compute[62208]: warnings.warn( [ 1947.713368] nova-compute[62208]: DEBUG nova.network.neutron [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Successfully created port: 79014c0d-af0e-4b99-b103-a613baf4fefa {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1948.602837] nova-compute[62208]: DEBUG nova.network.neutron [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Successfully updated port: 79014c0d-af0e-4b99-b103-a613baf4fefa {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1948.613922] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "refresh_cache-5b797610-f460-461c-8c5a-1a28cf162c0e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1948.614076] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "refresh_cache-5b797610-f460-461c-8c5a-1a28cf162c0e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1948.614233] nova-compute[62208]: DEBUG nova.network.neutron [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1948.655688] nova-compute[62208]: DEBUG nova.network.neutron [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1948.808244] nova-compute[62208]: DEBUG nova.network.neutron [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Updating instance_info_cache with network_info: [{"id": "79014c0d-af0e-4b99-b103-a613baf4fefa", "address": "fa:16:3e:a1:bf:29", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap79014c0d-af", "ovs_interfaceid": "79014c0d-af0e-4b99-b103-a613baf4fefa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1948.822285] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "refresh_cache-5b797610-f460-461c-8c5a-1a28cf162c0e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1948.822730] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Instance network_info: |[{"id": "79014c0d-af0e-4b99-b103-a613baf4fefa", "address": "fa:16:3e:a1:bf:29", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap79014c0d-af", "ovs_interfaceid": "79014c0d-af0e-4b99-b103-a613baf4fefa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 
1948.823797] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a1:bf:29', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '13af9422-d668-4413-b63a-766558d83a3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '79014c0d-af0e-4b99-b103-a613baf4fefa', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1948.832434] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1948.833473] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1948.833726] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1d1feb52-6c5e-4985-a946-4c39e0056e31 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1948.848166] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1948.848166] nova-compute[62208]: warnings.warn( [ 1948.853968] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1948.853968] nova-compute[62208]: value = "task-38651" [ 1948.853968] nova-compute[62208]: _type = "Task" [ 1948.853968] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1948.858795] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1948.858795] nova-compute[62208]: warnings.warn( [ 1948.863865] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38651, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1948.877506] nova-compute[62208]: DEBUG nova.compute.manager [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Received event network-vif-plugged-79014c0d-af0e-4b99-b103-a613baf4fefa {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1948.877735] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] Acquiring lock "5b797610-f460-461c-8c5a-1a28cf162c0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1948.878027] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] Lock "5b797610-f460-461c-8c5a-1a28cf162c0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1948.878149] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] Lock "5b797610-f460-461c-8c5a-1a28cf162c0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1948.878294] nova-compute[62208]: DEBUG nova.compute.manager [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] No waiting events found dispatching network-vif-plugged-79014c0d-af0e-4b99-b103-a613baf4fefa {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1948.878459] nova-compute[62208]: WARNING nova.compute.manager [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Received unexpected event network-vif-plugged-79014c0d-af0e-4b99-b103-a613baf4fefa for instance with vm_state building and task_state spawning. [ 1948.878620] nova-compute[62208]: DEBUG nova.compute.manager [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Received event network-changed-79014c0d-af0e-4b99-b103-a613baf4fefa {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1948.878773] nova-compute[62208]: DEBUG nova.compute.manager [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Refreshing instance network info cache due to event network-changed-79014c0d-af0e-4b99-b103-a613baf4fefa. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1948.878951] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] Acquiring lock "refresh_cache-5b797610-f460-461c-8c5a-1a28cf162c0e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1948.879091] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] Acquired lock "refresh_cache-5b797610-f460-461c-8c5a-1a28cf162c0e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1948.879275] nova-compute[62208]: DEBUG nova.network.neutron [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Refreshing network info cache for port 79014c0d-af0e-4b99-b103-a613baf4fefa {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1949.261345] nova-compute[62208]: DEBUG nova.network.neutron [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Updated VIF entry in instance network info cache for port 79014c0d-af0e-4b99-b103-a613baf4fefa. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1949.261711] nova-compute[62208]: DEBUG nova.network.neutron [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Updating instance_info_cache with network_info: [{"id": "79014c0d-af0e-4b99-b103-a613baf4fefa", "address": "fa:16:3e:a1:bf:29", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap79014c0d-af", "ovs_interfaceid": "79014c0d-af0e-4b99-b103-a613baf4fefa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1949.271898] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f02f1565-b3d2-474b-9abb-4a2ab0cd99d5 req-d5d50fec-20bf-4752-8393-38cd61c374a6 service nova] Releasing lock "refresh_cache-5b797610-f460-461c-8c5a-1a28cf162c0e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1949.358556] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1949.358556] nova-compute[62208]: warnings.warn( [ 1949.364348] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38651, 'name': CreateVM_Task, 'duration_secs': 0.308581} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1949.364527] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1949.365118] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1949.365353] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1949.368188] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccec0446-b4c5-4ca1-a79e-9c91d10c7f1b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1949.378298] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1949.378298] nova-compute[62208]: warnings.warn( [ 1949.400989] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Reconfiguring VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 1949.401683] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-cf431de2-d06e-4cb7-9cd2-167e308b5b59 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1949.411993] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1949.411993] nova-compute[62208]: warnings.warn( [ 1949.417799] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1949.417799] nova-compute[62208]: value = "task-38652" [ 1949.417799] nova-compute[62208]: _type = "Task" [ 1949.417799] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1949.420949] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1949.420949] nova-compute[62208]: warnings.warn( [ 1949.428489] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38652, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1949.922086] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1949.922086] nova-compute[62208]: warnings.warn( [ 1949.928412] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38652, 'name': ReconfigVM_Task, 'duration_secs': 0.108975} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1949.928656] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Reconfigured VM instance to enable vnc on port - 5909 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 1949.928870] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.564s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1949.929149] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1949.929359] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1949.929699] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1949.929958] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d2de56ef-2cbe-4b93-83f4-cbb94bd737a0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1949.931508] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1949.931508] nova-compute[62208]: warnings.warn( [ 1949.934749] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1949.934749] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52882044-f193-5472-9904-8e03e6a0edaf" [ 1949.934749] nova-compute[62208]: _type = "Task" [ 1949.934749] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1949.937702] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1949.937702] nova-compute[62208]: warnings.warn( [ 1949.943094] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52882044-f193-5472-9904-8e03e6a0edaf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1950.438969] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1950.438969] nova-compute[62208]: warnings.warn( [ 1950.445951] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1950.446259] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1950.446481] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1952.979864] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "e0a444fc-dca2-419a-9ac1-8d71048e1690" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1954.512165] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1954.512464] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1968.141692] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1969.142279] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1973.141465] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1973.141793] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 1973.141793] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 1973.161712] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.161854] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.161984] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.162113] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.162240] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.162363] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.162490] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.162604] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.162722] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.162839] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 1973.162959] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 1973.163462] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1973.163659] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1973.172941] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1973.173153] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1973.173319] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1973.173470] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1973.174517] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a78d62b-1ec5-4959-a0b0-b39ad531170d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.177585] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1973.177585] nova-compute[62208]: warnings.warn( [ 1973.183558] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fb88dbb-f0fd-43a6-8f4b-5b1af4563f0d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.187211] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1973.187211] nova-compute[62208]: warnings.warn( [ 1973.197637] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2e32125-fd28-4995-921a-c36d0c8ac9fa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.199815] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1973.199815] nova-compute[62208]: warnings.warn( [ 1973.203979] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05a254a9-cdc4-497c-beff-4e3e45274fd0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.206919] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1973.206919] nova-compute[62208]: warnings.warn( [ 1973.233813] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181949MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1973.233959] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1973.234151] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1973.298249] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.298419] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ad00920b-3783-4c01-bb25-4f923d29dad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.298548] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.298668] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.298786] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.298902] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.299017] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.299159] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.299286] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.299404] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b797610-f460-461c-8c5a-1a28cf162c0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1973.310046] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1973.310262] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1973.310411] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1973.449230] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66a26456-4476-4159-8222-2cc25bcb759a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.451648] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1973.451648] nova-compute[62208]: warnings.warn( [ 1973.456965] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b65b47b8-b917-4b9f-a4ca-ac03b6631522 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.460459] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1973.460459] nova-compute[62208]: warnings.warn( [ 1973.488453] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a52c446-32f4-414a-b912-2f5d15b07463 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.491325] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1973.491325] nova-compute[62208]: warnings.warn( [ 1973.496393] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0963ab2-a773-4630-941a-4fdf41858a42 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.500931] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1973.500931] nova-compute[62208]: warnings.warn( [ 1973.511123] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1973.519348] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1973.536502] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1973.536716] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1974.514366] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1976.136609] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1976.140487] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1978.141238] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task 
ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1978.141670] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 1985.136636] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1990.651226] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "ec568c91-b110-4c2a-8d62-8127c7781d03" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1990.651639] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1995.686915] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1995.686915] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 1995.687629] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 
tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1995.689219] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1995.689474] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Copying Virtual Disk [datastore2] vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/693fd7b1-c01f-4e13-883f-b47a5994eaf9/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1995.689766] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1cdae3af-0e6e-417c-b505-2db07c3655fa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.692013] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1995.692013] nova-compute[62208]: warnings.warn( [ 1995.697213] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1995.697213] nova-compute[62208]: value = "task-38653" [ 1995.697213] nova-compute[62208]: _type = "Task" [ 1995.697213] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1995.700760] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1995.700760] nova-compute[62208]: warnings.warn( [ 1995.705886] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38653, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1996.201997] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.201997] nova-compute[62208]: warnings.warn( [ 1996.208219] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1996.208536] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1996.209225] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Traceback (most recent call last): [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] yield resources [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] self.driver.spawn(context, instance, image_meta, [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] self._fetch_image_if_missing(context, vi) [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] image_cache(vi, tmp_image_ds_loc) [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1996.209225] nova-compute[62208]: 
ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] vm_util.copy_virtual_disk( [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] session._wait_for_task(vmdk_copy_task) [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] return self.wait_for_task(task_ref) [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] return evt.wait() [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] result = hub.switch() [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] return self.greenlet.switch() [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] self.f(*self.args, **self.kw) [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] raise exceptions.translate_fault(task_info.error) [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Faults: ['InvalidArgument'] [ 1996.209225] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] [ 1996.210084] nova-compute[62208]: INFO nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Terminating instance [ 1996.211671] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1996.211921] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1996.212210] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3ed3afa2-c416-4d98-aaca-c31f55a6d82f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.214634] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1996.214872] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1996.215619] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c49e1e9-9aa9-44bd-9a8b-90a2d0df6c49 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.217881] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.217881] nova-compute[62208]: warnings.warn( [ 1996.218268] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.218268] nova-compute[62208]: warnings.warn( [ 1996.222715] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1996.223025] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0b0ac7c6-be6c-43a5-b5dc-e786e3e967d3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.225306] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1996.225532] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1996.226143] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.226143] nova-compute[62208]: warnings.warn( [ 1996.226593] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95d8a711-2413-4d23-bc69-4288e8cbd869 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.228845] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.228845] nova-compute[62208]: warnings.warn( [ 1996.231951] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1996.231951] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]523d38a0-4ea7-6af8-c260-28fe9f136860" [ 1996.231951] nova-compute[62208]: _type = "Task" [ 1996.231951] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1996.234857] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.234857] nova-compute[62208]: warnings.warn( [ 1996.240245] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]523d38a0-4ea7-6af8-c260-28fe9f136860, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1996.287906] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1996.288260] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1996.288509] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleting the datastore file [datastore2] 7311ba0c-9a1b-4482-a4eb-6afe993e6656 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1996.288824] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cad99dcd-6c16-4ef6-83d4-194edc1c1310 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.290839] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.290839] nova-compute[62208]: warnings.warn( [ 1996.295283] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 1996.295283] nova-compute[62208]: value = "task-38655" [ 1996.295283] nova-compute[62208]: _type = "Task" [ 1996.295283] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1996.298728] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.298728] nova-compute[62208]: warnings.warn( [ 1996.303370] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38655, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1996.736552] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.736552] nova-compute[62208]: warnings.warn( [ 1996.743015] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1996.743370] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Creating directory with path [datastore2] vmware_temp/00a3a618-5c21-4a1a-839b-ba4af9d5e305/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1996.743660] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d31fc74-297d-4f38-a7bc-4f87ffa4ed55 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.745650] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.745650] nova-compute[62208]: warnings.warn( [ 1996.755778] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Created directory with path [datastore2] vmware_temp/00a3a618-5c21-4a1a-839b-ba4af9d5e305/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1996.756115] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Fetch image to [datastore2] vmware_temp/00a3a618-5c21-4a1a-839b-ba4af9d5e305/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1996.756328] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/00a3a618-5c21-4a1a-839b-ba4af9d5e305/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1996.757140] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0a7643c-b000-4c66-8592-088d15e458a7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.759595] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.759595] nova-compute[62208]: warnings.warn( [ 1996.764293] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dcdb3b7-eafb-458d-a872-029e17d36029 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.766683] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.766683] nova-compute[62208]: warnings.warn( [ 1996.773983] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07ae7b1d-4689-4b0e-b5bc-8cee6250473a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.777797] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.777797] nova-compute[62208]: warnings.warn( [ 1996.810344] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b108fb3-390d-4d5c-88a5-58fb756e215d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.812600] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.812600] nova-compute[62208]: warnings.warn( [ 1996.813029] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.813029] nova-compute[62208]: warnings.warn( [ 1996.820888] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0ec876af-8658-4136-8485-d59b0a787282 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1996.822852] nova-compute[62208]: DEBUG oslo_vmware.api [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38655, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075086} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1996.823168] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1996.823389] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1996.823601] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1996.823813] nova-compute[62208]: INFO nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1996.825486] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1996.825486] nova-compute[62208]: warnings.warn( [ 1996.826085] nova-compute[62208]: DEBUG nova.compute.claims [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935fc6dd0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1996.826409] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1996.826674] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1996.845356] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1996.999645] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = getattr(controller, method)(*args, **kwargs) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._get(image_id) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] resp, body = self.http_client.get(url, headers=header) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.request(url, 'GET', **kwargs) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._handle_response(resp) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exc.from_response(resp, resp.content) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] During handling of the above exception, another exception occurred: [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] yield resources [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self.driver.spawn(context, instance, image_meta, [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._fetch_image_if_missing(context, vi) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image_fetch(context, vi, tmp_image_ds_loc) [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] images.fetch_image( [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1997.000453] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] metadata = IMAGE_API.get(context, image_ref) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 1206, in get [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return session.show(context, image_id, [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: 
ad00920b-3783-4c01-bb25-4f923d29dad7] _reraise_translated_image_exception(image_id) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 1032, in _reraise_translated_image_exception [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise new_exc.with_traceback(exc_trace) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = getattr(controller, method)(*args, **kwargs) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._get(image_id) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] resp, body = self.http_client.get(url, headers=header) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.request(url, 'GET', **kwargs) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._handle_response(resp) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exc.from_response(resp, resp.content) [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] nova.exception.ImageNotAuthorized: Not authorized for image 
77df2b34-a7d7-43a1-a59a-01f7474c0cf7. [ 1997.001494] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1997.001494] nova-compute[62208]: INFO nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Terminating instance [ 1997.002225] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1997.002615] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1997.002693] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5a8ad864-0df9-41ff-bedb-ebf02a682b8e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.005220] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1997.005475] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1997.006565] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa6d1c4-4904-4829-a0c1-7af69cbab9d5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.011977] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.011977] nova-compute[62208]: warnings.warn( [ 1997.012465] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.012465] nova-compute[62208]: warnings.warn( [ 1997.016826] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1997.017079] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3fa6917c-78d6-45fe-a399-f3e80790e42a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.019494] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1997.019671] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1997.020368] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.020368] nova-compute[62208]: warnings.warn( [ 1997.020695] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-49100579-e671-4a94-ba0b-44a3808d4334 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.024858] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.024858] nova-compute[62208]: warnings.warn( [ 1997.027976] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 1997.027976] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5264813e-eb9d-c419-d983-8335774c993f" [ 1997.027976] nova-compute[62208]: _type = "Task" [ 1997.027976] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1997.035732] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.035732] nova-compute[62208]: warnings.warn( [ 1997.039654] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5264813e-eb9d-c419-d983-8335774c993f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1997.040460] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad70d890-9f52-4c91-ab6a-3c8d6dcac879 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.042754] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.042754] nova-compute[62208]: warnings.warn( [ 1997.047128] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f90b7e09-9129-43dd-a61b-31da4ac31c9f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.050227] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.050227] nova-compute[62208]: warnings.warn( [ 1997.082433] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df5d4022-fc69-4854-b0e7-61535a31268b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.084946] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.084946] nova-compute[62208]: warnings.warn( [ 1997.090184] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da5c5b24-646d-4d66-8997-71e4730b8ab7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.093895] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.093895] nova-compute[62208]: warnings.warn( [ 1997.103945] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1997.113628] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1997.131303] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.304s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1997.131964] nova-compute[62208]: Faults: ['InvalidArgument'] [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Traceback (most recent call last): [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] self.driver.spawn(context, instance, image_meta, [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] self._fetch_image_if_missing(context, vi) [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] image_cache(vi, tmp_image_ds_loc) [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] vm_util.copy_virtual_disk( [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] session._wait_for_task(vmdk_copy_task) [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] return self.wait_for_task(task_ref) [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] return evt.wait() [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] result = hub.switch() [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] return self.greenlet.switch() [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] self.f(*self.args, **self.kw) [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] raise exceptions.translate_fault(task_info.error) [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Faults: ['InvalidArgument'] [ 1997.131964] nova-compute[62208]: ERROR nova.compute.manager [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] [ 1997.132847] nova-compute[62208]: DEBUG nova.compute.utils 
[None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1997.134178] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Build of instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 was re-scheduled: A specified parameter was not correct: fileType [ 1997.134178] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1997.134562] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1997.134738] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1997.134954] nova-compute[62208]: DEBUG nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1997.135069] nova-compute[62208]: DEBUG nova.network.neutron [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1997.159877] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1997.160318] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1997.160555] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Deleting the datastore file [datastore2] ad00920b-3783-4c01-bb25-4f923d29dad7 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1997.160846] nova-compute[62208]: DEBUG 
oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4d1e7c72-eda3-45b2-9927-9f04018c79af {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.163523] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.163523] nova-compute[62208]: warnings.warn( [ 1997.169497] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for the task: (returnval){ [ 1997.169497] nova-compute[62208]: value = "task-38657" [ 1997.169497] nova-compute[62208]: _type = "Task" [ 1997.169497] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1997.174902] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.174902] nova-compute[62208]: warnings.warn( [ 1997.181098] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38657, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1997.405193] nova-compute[62208]: DEBUG nova.network.neutron [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1997.419429] nova-compute[62208]: INFO nova.compute.manager [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Took 0.28 seconds to deallocate network for instance. [ 1997.525096] nova-compute[62208]: INFO nova.scheduler.client.report [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleted allocations for instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 [ 1997.534316] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.534316] nova-compute[62208]: warnings.warn( [ 1997.542244] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1997.543026] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating directory with path [datastore2] vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1997.543305] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e7bba7d2-8208-4653-8ae4-3f609a4df2d1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.545585] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.545585] nova-compute[62208]: warnings.warn( [ 1997.547156] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-749f5d95-ad12-4300-826e-48971f0d6f74 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 635.601s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1997.548331] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 438.979s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1997.548551] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1997.548766] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1997.548938] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1997.550980] nova-compute[62208]: INFO nova.compute.manager [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Terminating instance [ 1997.552772] nova-compute[62208]: DEBUG nova.compute.manager [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1997.552966] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1997.553425] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-061e797f-70fb-4547-b924-b6aa3566bc5b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.556688] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created directory with path [datastore2] vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1997.556883] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Fetch image to [datastore2] vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1997.557059] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1997.557217] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.557217] nova-compute[62208]: warnings.warn( [ 1997.558229] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ba0058a-0b78-4f3a-8fa3-989b1c3f6812 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.561735] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.561735] nova-compute[62208]: warnings.warn( [ 1997.564879] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7128eb96-5752-4430-8880-1f30772c46a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.578362] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1997.580773] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.580773] nova-compute[62208]: warnings.warn( [ 1997.581793] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18a31a33-df34-4673-8294-24a22cfdd0ed {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.586144] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.586144] nova-compute[62208]: warnings.warn( [ 1997.608568] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42f2b3c2-cc4f-4498-b531-5d33ed044301 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.613408] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7311ba0c-9a1b-4482-a4eb-6afe993e6656 could not be found. 
[ 1997.613620] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1997.613801] nova-compute[62208]: INFO nova.compute.manager [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Took 0.06 seconds to destroy the instance on the hypervisor. [ 1997.614050] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1997.614289] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1997.614386] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1997.616438] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.616438] nova-compute[62208]: warnings.warn( [ 1997.649143] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1997.649499] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1997.651228] nova-compute[62208]: INFO nova.compute.claims [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1997.654181] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66e8c5a8-82de-4eae-b26a-a407c31616d5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.658661] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.658661] nova-compute[62208]: warnings.warn( [ 1997.663289] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7ecddec6-0c18-491b-b22d-133d594018bf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.667537] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1997.668604] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.668604] nova-compute[62208]: warnings.warn( [ 1997.672260] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.672260] nova-compute[62208]: warnings.warn( [ 1997.679055] nova-compute[62208]: DEBUG oslo_vmware.api [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Task: {'id': task-38657, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080663} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1997.679348] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1997.679552] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1997.679751] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1997.679939] nova-compute[62208]: INFO nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Took 0.67 seconds to destroy the instance on the hypervisor. [ 1997.682124] nova-compute[62208]: DEBUG nova.compute.claims [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935e969b0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1997.682306] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1997.683158] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] Took 0.07 seconds to deallocate network for instance. 
[ 1997.691964] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1997.743692] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1997.804790] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-fa863e60-41cb-4576-ba98-5e673b6e51cb tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.256s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1997.805698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 198.399s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1997.805883] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 7311ba0c-9a1b-4482-a4eb-6afe993e6656] During sync_power_state the instance has a pending task (deleting). Skip. [ 1997.806059] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "7311ba0c-9a1b-4482-a4eb-6afe993e6656" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1997.808769] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1997.808989] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1997.903935] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d6beac7-5f62-491a-8213-bc0b7f5a497c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.906694] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.906694] nova-compute[62208]: warnings.warn( [ 1997.911830] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f1dc185-a3cf-4801-96a6-539ff1fdd855 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.916130] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.916130] nova-compute[62208]: warnings.warn( [ 1997.941921] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-909d9560-d8f6-4acb-b878-16db687216ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.944294] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.944294] nova-compute[62208]: warnings.warn( [ 1997.949414] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aaac939-8846-40b1-873d-cade819013c4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1997.953271] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1997.953271] nova-compute[62208]: warnings.warn( [ 1997.962665] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1997.971709] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1997.988610] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1997.989114] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1997.991499] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.309s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1998.030983] nova-compute[62208]: DEBUG nova.compute.utils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1998.036696] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1998.036781] nova-compute[62208]: DEBUG nova.network.neutron [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1998.048147] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1998.088966] nova-compute[62208]: DEBUG nova.policy [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7534a5a8a37e4451918e35c8b93d4ad5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8eef1e68dea42cf98f03dc8db29498a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1998.120888] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1998.146973] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1998.147219] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1998.147375] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1998.147546] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1998.147689] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1998.147831] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1998.148048] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1998.148231] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1998.148399] nova-compute[62208]: DEBUG nova.virt.hardware [None 
req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1998.148562] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1998.148733] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1998.149589] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abd8cf9f-691f-4fb1-ab82-f045b46422c7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.154086] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.154086] nova-compute[62208]: warnings.warn( [ 1998.160207] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47bd9164-bd09-4f4b-9f44-9654f3a07e36 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.169995] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.169995] nova-compute[62208]: warnings.warn( [ 1998.206101] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-726ef93b-f8df-4a7b-bf02-3eb08e734fa4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.208496] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.208496] nova-compute[62208]: warnings.warn( [ 1998.213475] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d46ed44-35e3-4548-967c-e043850af4a9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.216598] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.216598] nova-compute[62208]: warnings.warn( [ 1998.243381] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00b38e9b-8288-4d9c-bf00-36c50cf5259c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.245810] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.245810] nova-compute[62208]: warnings.warn( [ 1998.251293] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87c2111f-d1db-4078-ae47-f8f863cf3f32 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.254966] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.254966] nova-compute[62208]: warnings.warn( [ 1998.264777] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1998.274307] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1998.292816] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.301s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = getattr(controller, method)(*args, **kwargs) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._get(image_id) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] resp, body = self.http_client.get(url, headers=header) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.request(url, 'GET', **kwargs) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._handle_response(resp) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exc.from_response(resp, resp.content) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] During handling of the above exception, another exception occurred: [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self.driver.spawn(context, instance, image_meta, [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._fetch_image_if_missing(context, vi) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image_fetch(context, vi, tmp_image_ds_loc) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] images.fetch_image( [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] metadata = IMAGE_API.get(context, image_ref) [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 1206, in get [ 1998.293552] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return session.show(context, image_id, [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] _reraise_translated_image_exception(image_id) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 1032, in _reraise_translated_image_exception [ 1998.294542] nova-compute[62208]: ERROR 
nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise new_exc.with_traceback(exc_trace) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = getattr(controller, method)(*args, **kwargs) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._get(image_id) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] resp, body = self.http_client.get(url, headers=header) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.request(url, 'GET', **kwargs) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._handle_response(resp) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exc.from_response(resp, resp.content) [ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] nova.exception.ImageNotAuthorized: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
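The traceback shows the translation step in nova/image/glance.py: the glanceclient HTTPUnauthorized raised inside show() is caught and re-raised by _reraise_translated_image_exception() as nova.exception.ImageNotAuthorized while keeping the original traceback, which is why the log prints "During handling of the above exception, another exception occurred". A rough, self-contained sketch of that pattern, with stand-in exception classes rather than Nova's real ones:

# Rough sketch of the exception-translation pattern visible in the traceback:
# a client-level 401 is re-raised as a service-level exception, preserving
# the original traceback. All names here are stand-ins.
import sys

class HTTPUnauthorized(Exception):    # stands in for glanceclient.exc.HTTPUnauthorized
    pass

class ImageNotAuthorized(Exception):  # stands in for nova.exception.ImageNotAuthorized
    pass

def _reraise_translated_image_exception(image_id):
    _exc_type, exc_value, exc_trace = sys.exc_info()
    if isinstance(exc_value, HTTPUnauthorized):
        new_exc = ImageNotAuthorized('Not authorized for image %s.' % image_id)
        # Re-raise the translated exception with the original traceback;
        # implicit chaining produces the "During handling ..." output.
        raise new_exc.with_traceback(exc_trace)
    raise

def show(image_id):
    try:
        raise HTTPUnauthorized('HTTP 401 Unauthorized')  # stands in for the Glance GET
    except Exception:
        _reraise_translated_image_exception(image_id)

try:
    show('77df2b34-a7d7-43a1-a59a-01f7474c0cf7')
except ImageNotAuthorized as exc:
    print('translated to: %r' % exc)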
[ 1998.294542] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.294542] nova-compute[62208]: DEBUG nova.compute.utils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1998.295715] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Build of instance ad00920b-3783-4c01-bb25-4f923d29dad7 was re-scheduled: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 1998.296241] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 1998.296423] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 1998.296582] nova-compute[62208]: DEBUG nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1998.296745] nova-compute[62208]: DEBUG nova.network.neutron [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1998.432024] nova-compute[62208]: DEBUG neutronclient.v2_0.client [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = getattr(controller, method)(*args, **kwargs) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._get(image_id) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] resp, body = self.http_client.get(url, headers=header) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.request(url, 'GET', **kwargs) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._handle_response(resp) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exc.from_response(resp, resp.content) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] During handling of the above exception, another exception occurred: [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self.driver.spawn(context, instance, image_meta, [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._fetch_image_if_missing(context, vi) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image_fetch(context, vi, tmp_image_ds_loc) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] images.fetch_image( [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] metadata = IMAGE_API.get(context, image_ref) [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 1206, in get [ 1998.433199] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return session.show(context, image_id, [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] _reraise_translated_image_exception(image_id) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 1032, in _reraise_translated_image_exception [ 1998.434243] nova-compute[62208]: ERROR 
nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise new_exc.with_traceback(exc_trace) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = getattr(controller, method)(*args, **kwargs) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._get(image_id) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] resp, body = self.http_client.get(url, headers=header) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 393, in get [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.request(url, 'GET', **kwargs) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self._handle_response(resp) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exc.from_response(resp, resp.content) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] nova.exception.ImageNotAuthorized: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. 
[ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] During handling of the above exception, another exception occurred: [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 2447, in _do_build_and_run_instance [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._build_and_run_instance(context, instance, image, [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 2739, in _build_and_run_instance [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exception.RescheduledException( [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] nova.exception.RescheduledException: Build of instance ad00920b-3783-4c01-bb25-4f923d29dad7 was re-scheduled: Not authorized for image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7. [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] During handling of the above exception, another exception occurred: [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] exception_handler_v20(status_code, error_body) [ 1998.434243] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise client_exc(message=error_message, [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: 
ad00920b-3783-4c01-bb25-4f923d29dad7] Neutron server returns request_ids: ['req-01d6187d-71cb-4f90-973b-41e82b27894c'] [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] During handling of the above exception, another exception occurred: [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 3036, in _cleanup_allocated_networks [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._deallocate_network(context, instance, requested_networks) [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self.network_api.deallocate_for_instance( [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] data = neutron.list_ports(**search_opts) [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.list('ports', self.ports_path, retrieve_all, [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] for r in self._pagination(collection, path, **params): [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] res = self.get(path, params=params) [ 1998.435332] 
nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.retry_request("GET", action, body=body, [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.do_request(method, action, body=body, [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._handle_fault_response(status_code, replybody, resp) [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exception.Unauthorized() [ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] nova.exception.Unauthorized: Not authorized. 
[ 1998.435332] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.442024] nova-compute[62208]: DEBUG nova.network.neutron [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Successfully created port: 11d11885-444f-4ed9-afc6-47d9923e055d {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1998.500552] nova-compute[62208]: INFO nova.scheduler.client.report [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Deleted allocations for instance ad00920b-3783-4c01-bb25-4f923d29dad7 [ 1998.529542] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-eee7884c-0aaf-4edd-8f53-cb135349d06c tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 591.066s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1998.530952] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 395.948s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1998.531290] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Acquiring lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1998.531686] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1998.531957] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1998.534245] nova-compute[62208]: INFO nova.compute.manager [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Terminating instance [ 1998.536487] nova-compute[62208]: DEBUG nova.compute.manager [None 
req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 1998.536687] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1998.537182] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9ec220cc-243b-4d88-8020-88d5b5082478 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.539954] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.539954] nova-compute[62208]: warnings.warn( [ 1998.547386] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b3e9c0f-2d5f-4e2a-979f-c9edecb789a6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.559123] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 1998.562211] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.562211] nova-compute[62208]: warnings.warn( [ 1998.585809] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad00920b-3783-4c01-bb25-4f923d29dad7 could not be found. [ 1998.586065] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1998.586291] nova-compute[62208]: INFO nova.compute.manager [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Took 0.05 seconds to destroy the instance on the hypervisor. 
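During this second terminate, FindAllByUuid no longer finds the VM, so the driver logs the InstanceNotFound as a warning and still reports the instance destroyed. A simplified sketch of that tolerant-delete behaviour (names are stand-ins, not the actual nova.virt.vmwareapi code):

# Simplified sketch of the tolerant-delete pattern seen above: a VM that is
# already missing on the backend is logged as a warning and treated as
# destroyed. Names are stand-ins, not the real vmops implementation.
import logging

LOG = logging.getLogger(__name__)

class InstanceNotFound(Exception):
    pass

def _lookup_vm_ref(uuid):
    # Stand-in for SearchIndex.FindAllByUuid returning no match.
    raise InstanceNotFound('Instance %s could not be found.' % uuid)

def destroy(uuid):
    try:
        vm_ref = _lookup_vm_ref(uuid)
        # ... power off and unregister vm_ref here ...
    except InstanceNotFound as exc:
        LOG.warning('Instance does not exist on backend: %s', exc)
    LOG.debug('Instance destroyed')

destroy('ad00920b-3783-4c01-bb25-4f923d29dad7')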
[ 1998.586555] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1998.586835] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 1998.586929] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1998.633406] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1998.633655] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1998.635701] nova-compute[62208]: INFO nova.compute.claims [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1998.738456] nova-compute[62208]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62208) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1998.738456] nova-compute[62208]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-63a0ebb9-8b05-4cae-8ca1-7e0e1837677c'] [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1998.742047] 
nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1998.742047] nova-compute[62208]: ERROR oslo.service.loopingcall [ 1998.743331] nova-compute[62208]: ERROR nova.compute.manager [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
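The failure above is explicit about its cause: Neutron returned 401 Unauthorized while nova-compute tried to list ports during network deallocation, and the log asks to "verify Neutron admin credential located in nova.conf". A minimal sketch of the [neutron] section that message refers to is shown below; it assumes the standard keystoneauth password-plugin options, and every value (including the auth_url) is an illustrative placeholder rather than this deployment's actual configuration:

[neutron]
# Keystone service credentials nova-compute uses when calling Neutron on its own behalf;
# all values below are placeholders for illustration only
auth_type = password
auth_url = https://keystone.example.test/identity/v3
username = nova
password = <service-password>
project_name = service
user_domain_name = Default
project_domain_name = Default
region_name = RegionOne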
[ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] exception_handler_v20(status_code, error_body) [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise client_exc(message=error_message, [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Neutron server returns request_ids: ['req-63a0ebb9-8b05-4cae-8ca1-7e0e1837677c'] [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] During handling of the above exception, another exception occurred: [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Traceback (most recent call last): [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 3332, in do_terminate_instance [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._delete_instance(context, instance, bdms) [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 3267, in _delete_instance [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._shutdown_instance(context, instance, bdms) [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 3161, in 
_shutdown_instance [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._try_deallocate_network(context, instance, requested_networks) [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] with excutils.save_and_reraise_exception(): [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self.force_reraise() [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise self.value [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] _deallocate_network_with_retries() [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return evt.wait() [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = hub.switch() [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.greenlet.switch() [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = func(*self.args, **self.kw) [ 1998.774892] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] result = f(*args, **kwargs) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 3062, in 
_deallocate_network_with_retries [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._deallocate_network( [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self.network_api.deallocate_for_instance( [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] data = neutron.list_ports(**search_opts) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.list('ports', self.ports_path, retrieve_all, [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] for r in self._pagination(collection, path, **params): [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] res = self.get(path, params=params) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.retry_request("GET", action, body=body, [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.776150] 
nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] return self.do_request(method, action, body=body, [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] ret = obj(*args, **kwargs) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] self._handle_fault_response(status_code, replybody, resp) [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1998.776150] nova-compute[62208]: ERROR nova.compute.manager [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] [ 1998.813380] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.282s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1998.814548] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 199.408s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1998.814783] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1998.815002] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ad00920b-3783-4c01-bb25-4f923d29dad7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1998.865719] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-771e6036-08de-4e9a-88f3-d31a0dc2df46 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.868542] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.868542] nova-compute[62208]: warnings.warn( [ 1998.874533] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e674b9dc-0157-4a91-8631-96cac7133334 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.877903] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.877903] nova-compute[62208]: warnings.warn( [ 1998.880348] nova-compute[62208]: INFO nova.compute.manager [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] [instance: ad00920b-3783-4c01-bb25-4f923d29dad7] Successfully reverted task state from None on failure for instance. [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server [None req-a6dfe3da-41c5-482b-910a-039d40f7a8a6 tempest-ServersTestMultiNic-118407991 tempest-ServersTestMultiNic-118407991-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-63a0ebb9-8b05-4cae-8ca1-7e0e1837677c'] [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server 
return f(self, context, *args, **kw) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3344, in terminate_instance [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3339, in do_terminate_instance [ 1998.912401] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3332, in do_terminate_instance [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3267, in _delete_instance [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3161, in _shutdown_instance [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3075, in _try_deallocate_network [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise self.value [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3073, in _try_deallocate_network [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3062, in _deallocate_network_with_retries [ 1998.914159] 
nova-compute[62208]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2282, in _deallocate_network [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1998.914159] nova-compute[62208]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1998.915684] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1998.915684] nova-compute[62208]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 
1998.915684] nova-compute[62208]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1998.915684] nova-compute[62208]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1998.915684] nova-compute[62208]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1998.915684] nova-compute[62208]: ERROR oslo_messaging.rpc.server [ 1998.915684] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-323dea72-f1b3-4a6d-9487-0f3978e706ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.915684] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.915684] nova-compute[62208]: warnings.warn( [ 1998.921552] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3499ca0b-4194-4292-a82e-3f14b51d5e49 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.927339] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1998.927339] nova-compute[62208]: warnings.warn( [ 1998.938064] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1998.950467] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1998.968179] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1998.968805] nova-compute[62208]: DEBUG 
nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 1999.011469] nova-compute[62208]: DEBUG nova.compute.utils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1999.013008] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1999.013237] nova-compute[62208]: DEBUG nova.network.neutron [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1999.025125] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 1999.074312] nova-compute[62208]: DEBUG nova.policy [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48cf6bc9785d46088589c14e7e8c14ca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '910bab22145d4f8cbd354ecf005eed6a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 1999.105969] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 1999.128285] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1999.128668] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1999.128886] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1999.129133] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1999.129341] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1999.129545] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1999.129817] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1999.130038] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1999.130290] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1999.130519] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1999.130769] nova-compute[62208]: DEBUG nova.virt.hardware [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1999.131789] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-673877dd-ce40-4cd0-86d9-86c5f5949ca4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1999.134506] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1999.134506] nova-compute[62208]: warnings.warn( [ 1999.141337] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1de5bf8b-e1d4-4176-a318-fd12bc0dead2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1999.145310] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1999.145310] nova-compute[62208]: warnings.warn( [ 1999.164489] nova-compute[62208]: DEBUG nova.network.neutron [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Successfully updated port: 11d11885-444f-4ed9-afc6-47d9923e055d {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1999.176277] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "refresh_cache-597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1999.176438] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "refresh_cache-597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1999.176591] nova-compute[62208]: DEBUG nova.network.neutron [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1999.264742] nova-compute[62208]: DEBUG nova.network.neutron [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1999.581533] nova-compute[62208]: DEBUG nova.compute.manager [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Received event network-vif-plugged-11d11885-444f-4ed9-afc6-47d9923e055d {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1999.581773] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] Acquiring lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1999.581981] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1999.582158] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1999.582332] nova-compute[62208]: DEBUG nova.compute.manager [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] No waiting events found dispatching network-vif-plugged-11d11885-444f-4ed9-afc6-47d9923e055d {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1999.582499] nova-compute[62208]: WARNING nova.compute.manager [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Received unexpected event network-vif-plugged-11d11885-444f-4ed9-afc6-47d9923e055d for instance with vm_state building and task_state spawning. [ 1999.582693] nova-compute[62208]: DEBUG nova.compute.manager [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Received event network-changed-11d11885-444f-4ed9-afc6-47d9923e055d {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 1999.582817] nova-compute[62208]: DEBUG nova.compute.manager [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Refreshing instance network info cache due to event network-changed-11d11885-444f-4ed9-afc6-47d9923e055d. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 1999.582982] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] Acquiring lock "refresh_cache-597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1999.831811] nova-compute[62208]: DEBUG nova.network.neutron [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Updating instance_info_cache with network_info: [{"id": "11d11885-444f-4ed9-afc6-47d9923e055d", "address": "fa:16:3e:37:3b:0a", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11d11885-44", "ovs_interfaceid": "11d11885-444f-4ed9-afc6-47d9923e055d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1999.846352] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "refresh_cache-597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1999.846633] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Instance network_info: |[{"id": "11d11885-444f-4ed9-afc6-47d9923e055d", "address": "fa:16:3e:37:3b:0a", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11d11885-44", "ovs_interfaceid": 
"11d11885-444f-4ed9-afc6-47d9923e055d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 1999.847449] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] Acquired lock "refresh_cache-597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1999.847712] nova-compute[62208]: DEBUG nova.network.neutron [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Refreshing network info cache for port 11d11885-444f-4ed9-afc6-47d9923e055d {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1999.848888] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:37:3b:0a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'da623279-b6f6-4570-8b15-a332120b8b60', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '11d11885-444f-4ed9-afc6-47d9923e055d', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1999.856340] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1999.857480] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1999.863046] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5a1310a5-3337-4a44-bee9-beb7dede44c5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1999.880665] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1999.880665] nova-compute[62208]: warnings.warn( [ 1999.886747] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1999.886747] nova-compute[62208]: value = "task-38658" [ 1999.886747] nova-compute[62208]: _type = "Task" [ 1999.886747] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1999.891358] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 1999.891358] nova-compute[62208]: warnings.warn( [ 1999.899623] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38658, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1999.983468] nova-compute[62208]: DEBUG nova.network.neutron [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Successfully created port: c54654ba-552f-4027-ae06-6e691dfdd033 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2000.390506] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2000.390506] nova-compute[62208]: warnings.warn( [ 2000.397163] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38658, 'name': CreateVM_Task, 'duration_secs': 0.349051} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2000.399782] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2000.400526] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2000.400778] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2000.404041] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-086ce911-839f-44a1-a045-0745bee95c46 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.415396] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2000.415396] nova-compute[62208]: warnings.warn( [ 2000.438782] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2000.439248] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b98d5912-c65a-491b-a4ab-a990b5349cd3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.452309] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2000.452309] nova-compute[62208]: warnings.warn( [ 2000.459057] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2000.459057] nova-compute[62208]: value = "task-38659" [ 2000.459057] nova-compute[62208]: _type = "Task" [ 2000.459057] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.462499] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2000.462499] nova-compute[62208]: warnings.warn( [ 2000.469062] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38659, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2000.478792] nova-compute[62208]: DEBUG nova.network.neutron [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Updated VIF entry in instance network info cache for port 11d11885-444f-4ed9-afc6-47d9923e055d. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2000.479245] nova-compute[62208]: DEBUG nova.network.neutron [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Updating instance_info_cache with network_info: [{"id": "11d11885-444f-4ed9-afc6-47d9923e055d", "address": "fa:16:3e:37:3b:0a", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11d11885-44", "ovs_interfaceid": "11d11885-444f-4ed9-afc6-47d9923e055d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2000.490919] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ffbb4270-2656-48b6-8e1d-a39198668d44 req-55492ed6-7b48-4bd2-bef6-8b1b67d3609f service nova] Releasing lock "refresh_cache-597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2000.964844] nova-compute[62208]: DEBUG nova.network.neutron [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Successfully updated port: c54654ba-552f-4027-ae06-6e691dfdd033 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2000.965891] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2000.965891] nova-compute[62208]: warnings.warn( [ 2000.972602] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38659, 'name': ReconfigVM_Task, 'duration_secs': 0.132566} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2000.973000] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2000.973255] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.572s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2000.973576] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2000.973764] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2000.974142] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2000.974459] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ad496c21-7068-4bce-8026-e413d0ed6fea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.977514] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2000.977514] nova-compute[62208]: warnings.warn( [ 2000.981803] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "refresh_cache-ec568c91-b110-4c2a-8d62-8127c7781d03" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2000.982163] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "refresh_cache-ec568c91-b110-4c2a-8d62-8127c7781d03" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2000.982264] nova-compute[62208]: DEBUG nova.network.neutron [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2000.987221] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2000.987221] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5224d0af-4e95-efc7-e8fc-a5ac88644783" [ 2000.987221] nova-compute[62208]: _type = "Task" [ 2000.987221] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.999083] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2000.999083] nova-compute[62208]: warnings.warn( [ 2001.007425] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2001.007425] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2001.007425] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2001.035329] nova-compute[62208]: DEBUG nova.network.neutron [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2001.273072] nova-compute[62208]: DEBUG nova.network.neutron [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Updating instance_info_cache with network_info: [{"id": "c54654ba-552f-4027-ae06-6e691dfdd033", "address": "fa:16:3e:2c:e8:06", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc54654ba-55", "ovs_interfaceid": "c54654ba-552f-4027-ae06-6e691dfdd033", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2001.289627] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock 
"refresh_cache-ec568c91-b110-4c2a-8d62-8127c7781d03" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2001.289759] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Instance network_info: |[{"id": "c54654ba-552f-4027-ae06-6e691dfdd033", "address": "fa:16:3e:2c:e8:06", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc54654ba-55", "ovs_interfaceid": "c54654ba-552f-4027-ae06-6e691dfdd033", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 2001.290509] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2c:e8:06', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '10b81051-1eb1-406b-888c-4548c470c77e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c54654ba-552f-4027-ae06-6e691dfdd033', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2001.299400] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2001.299998] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2001.300269] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-efbfae69-a074-4c79-8209-ba5b2e2610ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.326873] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2001.326873] nova-compute[62208]: warnings.warn( [ 2001.333906] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2001.333906] nova-compute[62208]: value = "task-38660" [ 2001.333906] nova-compute[62208]: _type = "Task" [ 2001.333906] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2001.337471] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2001.337471] nova-compute[62208]: warnings.warn( [ 2001.344036] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38660, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2001.785222] nova-compute[62208]: DEBUG nova.compute.manager [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Received event network-vif-plugged-c54654ba-552f-4027-ae06-6e691dfdd033 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2001.785536] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] Acquiring lock "ec568c91-b110-4c2a-8d62-8127c7781d03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2001.785669] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.785831] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2001.785995] nova-compute[62208]: DEBUG nova.compute.manager [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] No waiting events found dispatching network-vif-plugged-c54654ba-552f-4027-ae06-6e691dfdd033 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2001.786311] nova-compute[62208]: WARNING nova.compute.manager [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Received unexpected event network-vif-plugged-c54654ba-552f-4027-ae06-6e691dfdd033 for instance with vm_state building and task_state spawning. 
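The "No waiting events found" / "Received unexpected event network-vif-plugged-..." pair above comes from nova-compute's external-event handling: when Neutron reports the VIF as plugged, the compute manager takes the per-instance "<uuid>-events" lock and tries to pop a waiting event; if nothing registered a wait (here the instance is still building/spawning), it only logs the warning and carries on. The following is a minimal, hypothetical Python sketch of that pop-or-warn pattern; the class and function names are illustrative only and are not nova's actual InstanceEvents implementation.

# Simplified sketch of the "pop waiting event" pattern visible in the log above.
# Hypothetical names; not nova's real code.
import threading
from concurrent.futures import Future

class InstanceEventsSketch:
    def __init__(self):
        self._lock = threading.Lock()      # stands in for the "<uuid>-events" lock
        self._events = {}                  # {instance_uuid: {event_tag: Future}}

    def prepare_for_event(self, instance_uuid, event_tag):
        """Register interest in an external event, e.g. 'network-vif-plugged-<port>'."""
        fut = Future()
        with self._lock:
            self._events.setdefault(instance_uuid, {})[event_tag] = fut
        return fut

    def pop_instance_event(self, instance_uuid, event_tag):
        """Return the waiting Future for this event, or None if nobody is waiting."""
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_tag, None)

def external_instance_event(events, instance_uuid, vm_state, task_state, event_tag):
    fut = events.pop_instance_event(instance_uuid, event_tag)
    if fut is None:
        # Mirrors the WARNING in the log: the event arrived before anything
        # started waiting for it, which is common while the VM is still spawning.
        print("WARNING: received unexpected event %s for instance with "
              "vm_state %s and task_state %s" % (event_tag, vm_state, task_state))
    else:
        fut.set_result(event_tag)          # unblocks whoever is waiting on the event

# Usage: the Neutron notification arrives while the instance is still building.
events = InstanceEventsSketch()
external_instance_event(events, "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2",
                        "building", "spawning",
                        "network-vif-plugged-11d11885-444f-4ed9-afc6-47d9923e055d")

Consistent with this sketch, the log shows the warning is benign here: the build proceeds immediately afterwards (CreateVM_Task, ReconfigVM_Task for VNC, datastore image-cache handling) and the event merely triggers a network info cache refresh.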
[ 2001.786545] nova-compute[62208]: DEBUG nova.compute.manager [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Received event network-changed-c54654ba-552f-4027-ae06-6e691dfdd033 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2001.786712] nova-compute[62208]: DEBUG nova.compute.manager [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Refreshing instance network info cache due to event network-changed-c54654ba-552f-4027-ae06-6e691dfdd033. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2001.786901] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] Acquiring lock "refresh_cache-ec568c91-b110-4c2a-8d62-8127c7781d03" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2001.787040] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] Acquired lock "refresh_cache-ec568c91-b110-4c2a-8d62-8127c7781d03" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2001.787200] nova-compute[62208]: DEBUG nova.network.neutron [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Refreshing network info cache for port c54654ba-552f-4027-ae06-6e691dfdd033 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2001.841102] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2001.841102] nova-compute[62208]: warnings.warn( [ 2001.846907] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38660, 'name': CreateVM_Task, 'duration_secs': 0.362736} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2001.847075] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2001.847657] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2001.847869] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.850717] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99a0cc71-3dff-4a8b-8ba6-9b129ca839cd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.861519] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2001.861519] nova-compute[62208]: warnings.warn( [ 2001.884847] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Reconfiguring VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2001.885468] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-44945900-fbdb-4d90-843d-c92a88c3a9f2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.896421] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2001.896421] nova-compute[62208]: warnings.warn( [ 2001.904533] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 2001.904533] nova-compute[62208]: value = "task-38661" [ 2001.904533] nova-compute[62208]: _type = "Task" [ 2001.904533] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2001.906050] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2001.906050] nova-compute[62208]: warnings.warn( [ 2001.911566] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38661, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2002.011704] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2002.012024] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2002.270959] nova-compute[62208]: DEBUG nova.network.neutron [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Updated VIF entry in instance network info cache for port c54654ba-552f-4027-ae06-6e691dfdd033. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2002.271341] nova-compute[62208]: DEBUG nova.network.neutron [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Updating instance_info_cache with network_info: [{"id": "c54654ba-552f-4027-ae06-6e691dfdd033", "address": "fa:16:3e:2c:e8:06", "network": {"id": "281fc297-5909-45d5-a98d-851fe6969333", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1363299703-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "910bab22145d4f8cbd354ecf005eed6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "10b81051-1eb1-406b-888c-4548c470c77e", "external-id": "nsx-vlan-transportzone-207", "segmentation_id": 207, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc54654ba-55", "ovs_interfaceid": "c54654ba-552f-4027-ae06-6e691dfdd033", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2002.285783] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-cbc4d558-93ab-4f08-8e78-795973ec1b9a req-06544066-534a-402a-97fa-b76df1e6dfe8 service nova] Releasing lock "refresh_cache-ec568c91-b110-4c2a-8d62-8127c7781d03" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2002.407692] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2002.407692] nova-compute[62208]: warnings.warn( [ 2002.414140] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38661, 'name': ReconfigVM_Task} progress is 99%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2002.908601] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2002.908601] nova-compute[62208]: warnings.warn( [ 2002.914785] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38661, 'name': ReconfigVM_Task} progress is 99%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2003.409057] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2003.409057] nova-compute[62208]: warnings.warn( [ 2003.414682] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38661, 'name': ReconfigVM_Task, 'duration_secs': 1.124456} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2003.414949] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Reconfigured VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2003.415160] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 1.567s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2003.415411] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2003.415536] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2003.415849] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2003.416144] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-91749fe0-8f3b-4d38-b3f7-a63ed550517d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.417670] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2003.417670] nova-compute[62208]: warnings.warn( [ 2003.421060] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 2003.421060] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f5bdb2-d5be-51df-a795-f9f24871065a" [ 2003.421060] nova-compute[62208]: _type = "Task" [ 2003.421060] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2003.423834] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2003.423834] nova-compute[62208]: warnings.warn( [ 2003.428617] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f5bdb2-d5be-51df-a795-f9f24871065a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2003.533946] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2003.925224] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2003.925224] nova-compute[62208]: warnings.warn( [ 2003.931440] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2003.931709] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2003.931922] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2029.141924] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2030.140837] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2033.142018] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2033.142378] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2033.142378] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2033.162683] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.162858] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.162968] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.163096] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.163220] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.163343] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.163462] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.163583] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.163701] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.163818] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2033.163937] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2035.142572] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2035.142572] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2035.142572] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2035.152350] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2035.152599] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2035.152745] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2035.152900] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2035.154002] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e144e1ba-9523-48e8-9f78-02814adaeb45 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2035.157157] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2035.157157] nova-compute[62208]: warnings.warn( [ 2035.163302] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-209f8f26-0161-4e59-a348-8e1ba98fbba9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2035.167063] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2035.167063] nova-compute[62208]: warnings.warn( [ 2035.178008] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a694637b-e6f9-4fa1-a9c4-02556da2dbb0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2035.180460] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2035.180460] nova-compute[62208]: warnings.warn( [ 2035.185173] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c0e679a-7538-4923-a808-e2ca0e31580b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2035.188104] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2035.188104] nova-compute[62208]: warnings.warn( [ 2035.214165] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181962MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2035.214317] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2035.214518] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2035.307725] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.307894] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.308053] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.308226] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.308361] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.308486] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.308611] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.308736] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b797610-f460-461c-8c5a-1a28cf162c0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.308856] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.308976] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec568c91-b110-4c2a-8d62-8127c7781d03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2035.320227] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2035.320463] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2035.320610] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2035.341332] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing inventories for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2035.354498] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating ProviderTree inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2035.354726] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2035.370086] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing aggregate associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, aggregates: None {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2035.389791] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing trait associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2035.529493] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e72a5d6-5968-47d9-b2f6-c6533a0f93c5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2035.532110] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2035.532110] nova-compute[62208]: warnings.warn( [ 2035.537212] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-378d7295-c7c1-4a4c-815f-9b278b65f7d4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2035.540056] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2035.540056] nova-compute[62208]: warnings.warn( [ 2035.567018] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8b8d1b-d984-4443-b1f6-5508f03075f1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2035.569536] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2035.569536] nova-compute[62208]: warnings.warn( [ 2035.575635] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e538690c-878e-426c-8792-cc605bb0e5fb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2035.579538] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2035.579538] nova-compute[62208]: warnings.warn( [ 2035.589643] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2035.599116] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2035.616115] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2035.616369] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.402s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2037.617221] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2038.135902] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2038.140579] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2038.140731] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances with incomplete migration {{(pid=62208) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11257}} [ 2039.149382] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.149671] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
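[editor's note] The resource-tracker pass above is internally consistent: the ten instances holding placement allocations of {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} account exactly for the "Final resource view" figures (used_vcpus=10, used_disk=10GB, and used_ram=1792MB once the 512 MB reserved in the MEMORY_MB inventory is included). A minimal Python sketch of that arithmetic, using only numbers printed in the log; the capacity line assumes placement's usual (total - reserved) * allocation_ratio formula and is not taken from this log:

    # All figures below are copied from the log records above; nothing is queried live.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
    }
    # Ten instances each hold a placement allocation of 1 VCPU / 128 MB / 1 GB.
    allocations = [{"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1}] * 10

    used_vcpus = sum(a["VCPU"] for a in allocations)                 # 10   -> used_vcpus=10
    used_disk  = sum(a["DISK_GB"] for a in allocations)              # 10   -> used_disk=10GB
    used_ram   = inventory["MEMORY_MB"]["reserved"] + sum(
        a["MEMORY_MB"] for a in allocations)                         # 512 + 1280 = 1792 MB

    # Schedulable capacity as placement would compute it (assumed formula):
    capacity = {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
                for rc, v in inventory.items()}                      # e.g. VCPU -> 192.0

    assert (used_vcpus, used_ram, used_disk) == (10, 1792, 10)

The eleventh allocation (instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a) is excluded on purpose: the tracker logs it as scheduled-but-not-started and skips it.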
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2039.149777] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.149908] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11219}} [ 2039.159276] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] There are 0 instances to clean {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11228}} [ 2047.101201] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2047.101201] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2047.101793] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2047.103635] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2047.103921] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 
tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Copying Virtual Disk [datastore2] vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/f757115c-1150-4f0a-9d94-773442ffd7dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2047.104257] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4e2b7702-cbbc-4a74-8e41-4af33b2ba91c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.106860] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.106860] nova-compute[62208]: warnings.warn( [ 2047.113070] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2047.113070] nova-compute[62208]: value = "task-38662" [ 2047.113070] nova-compute[62208]: _type = "Task" [ 2047.113070] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2047.116268] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.116268] nova-compute[62208]: warnings.warn( [ 2047.121790] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38662, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2047.617355] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.617355] nova-compute[62208]: warnings.warn( [ 2047.624287] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2047.624632] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2047.625190] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Traceback (most recent call last): [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] yield resources [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] self.driver.spawn(context, instance, image_meta, [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] self._fetch_image_if_missing(context, vi) [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] image_cache(vi, tmp_image_ds_loc) [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] vm_util.copy_virtual_disk( [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] 
session._wait_for_task(vmdk_copy_task) [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] return self.wait_for_task(task_ref) [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] return evt.wait() [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] result = hub.switch() [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] return self.greenlet.switch() [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] self.f(*self.args, **self.kw) [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] raise exceptions.translate_fault(task_info.error) [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Faults: ['InvalidArgument'] [ 2047.625190] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] [ 2047.626130] nova-compute[62208]: INFO nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Terminating instance [ 2047.627778] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2047.627778] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating 
directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2047.627778] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-18949b05-756f-40ff-b3bd-92b810705945 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.629928] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2047.630144] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2047.630907] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b37b874-7fad-4059-8792-47207da99cbf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.633706] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.633706] nova-compute[62208]: warnings.warn( [ 2047.634059] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.634059] nova-compute[62208]: warnings.warn( [ 2047.638646] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2047.638883] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2f92fe83-cd82-4db4-9ed8-09db21efa736 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.641217] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2047.641464] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2047.642165] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.642165] nova-compute[62208]: warnings.warn( [ 2047.642587] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-54032a2c-a8cd-49fb-a889-98f71201e40f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.644676] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.644676] nova-compute[62208]: warnings.warn( [ 2047.648661] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2047.648661] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f83bd9-96c3-be77-3c17-cf862d57b408" [ 2047.648661] nova-compute[62208]: _type = "Task" [ 2047.648661] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2047.651938] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.651938] nova-compute[62208]: warnings.warn( [ 2047.656926] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f83bd9-96c3-be77-3c17-cf862d57b408, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2047.713489] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2047.713739] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2047.713897] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleting the datastore file [datastore2] ca1b4fca-a4bb-4a37-8e88-45e103a3579f {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2047.714196] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-571f5281-78b2-484a-8f0e-de7a39b90178 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2047.716280] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.716280] nova-compute[62208]: warnings.warn( [ 2047.721115] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2047.721115] nova-compute[62208]: value = "task-38664" [ 2047.721115] nova-compute[62208]: _type = "Task" [ 2047.721115] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2047.724156] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2047.724156] nova-compute[62208]: warnings.warn( [ 2047.729222] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38664, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2048.153457] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.153457] nova-compute[62208]: warnings.warn( [ 2048.159873] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2048.160154] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating directory with path [datastore2] vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2048.160416] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5e1ae268-bf9e-4f76-81f8-6d7cb59ccf80 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.162429] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.162429] nova-compute[62208]: warnings.warn( [ 2048.173835] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2048.174063] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Fetch image to [datastore2] vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2048.174237] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2048.175035] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d731bb7e-d903-45ca-81f7-d65987b3be68 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.177489] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.177489] nova-compute[62208]: warnings.warn( [ 2048.182431] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca68345c-c0a8-48e5-ad9b-049c57801bb9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.184706] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.184706] nova-compute[62208]: warnings.warn( [ 2048.192059] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9ea13cf-ae93-4d94-a369-3956310b5282 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.195601] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.195601] nova-compute[62208]: warnings.warn( [ 2048.227414] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e4fe7aa-77f9-4a50-8d06-c900346f054a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.229659] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.229659] nova-compute[62208]: warnings.warn( [ 2048.230050] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.230050] nova-compute[62208]: warnings.warn( [ 2048.235204] nova-compute[62208]: DEBUG oslo_vmware.api [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38664, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075934} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2048.236765] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2048.236963] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2048.237139] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2048.237318] nova-compute[62208]: INFO nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Took 0.61 seconds to destroy the instance on the hypervisor. [ 2048.239168] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3e6370a8-71c4-46ed-8a2a-eb47ef3050a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.241120] nova-compute[62208]: DEBUG nova.compute.claims [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935946260> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2048.241298] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2048.241516] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2048.243844] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.243844] nova-compute[62208]: warnings.warn( [ 2048.264033] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2048.315708] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2048.372174] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2048.372421] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2048.462375] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eceaef4-35ed-4a91-8037-a426c725f42d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.466317] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.466317] nova-compute[62208]: warnings.warn( [ 2048.472282] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54632be8-83fb-4ed9-8e1b-73530647ad34 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.475465] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.475465] nova-compute[62208]: warnings.warn( [ 2048.503344] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b190ec4-2fb0-47b5-9a3a-b9865c828b5a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.505733] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.505733] nova-compute[62208]: warnings.warn( [ 2048.510999] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f30027ec-7e90-4e65-93e1-8304650f457e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.514681] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.514681] nova-compute[62208]: warnings.warn( [ 2048.524281] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2048.533629] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2048.551077] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.309s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2048.551655] nova-compute[62208]: Faults: 
['InvalidArgument'] [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Traceback (most recent call last): [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] self.driver.spawn(context, instance, image_meta, [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] self._fetch_image_if_missing(context, vi) [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] image_cache(vi, tmp_image_ds_loc) [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] vm_util.copy_virtual_disk( [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] session._wait_for_task(vmdk_copy_task) [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] return self.wait_for_task(task_ref) [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] return evt.wait() [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] result = hub.switch() [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2048.551655] nova-compute[62208]: ERROR 
nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] return self.greenlet.switch() [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] self.f(*self.args, **self.kw) [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] raise exceptions.translate_fault(task_info.error) [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Faults: ['InvalidArgument'] [ 2048.551655] nova-compute[62208]: ERROR nova.compute.manager [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] [ 2048.552569] nova-compute[62208]: DEBUG nova.compute.utils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2048.554209] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Build of instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f was re-scheduled: A specified parameter was not correct: fileType [ 2048.554209] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2048.554586] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2048.554762] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2048.554937] nova-compute[62208]: DEBUG nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2048.555125] nova-compute[62208]: DEBUG nova.network.neutron [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2048.833585] nova-compute[62208]: DEBUG nova.network.neutron [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2048.847854] nova-compute[62208]: INFO nova.compute.manager [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Took 0.29 seconds to deallocate network for instance. [ 2048.942831] nova-compute[62208]: INFO nova.scheduler.client.report [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted allocations for instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f [ 2048.968886] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e09ecc49-b0d9-4b9c-9218-55f4f26f82e6 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 588.076s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2048.970243] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 391.985s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2048.970450] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2048.970666] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock 
"ca1b4fca-a4bb-4a37-8e88-45e103a3579f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2048.970837] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2048.972911] nova-compute[62208]: INFO nova.compute.manager [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Terminating instance [ 2048.974628] nova-compute[62208]: DEBUG nova.compute.manager [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2048.974867] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2048.975403] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9dd87c1b-c1d0-4fe2-a7f6-41db0ca2508f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.977640] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.977640] nova-compute[62208]: warnings.warn( [ 2048.981150] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2048.987438] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be2adb50-a31f-43af-a5c6-80decd3af5c4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2048.997833] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2048.997833] nova-compute[62208]: warnings.warn( [ 2049.015992] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ca1b4fca-a4bb-4a37-8e88-45e103a3579f could not be found. [ 2049.016290] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2049.016485] nova-compute[62208]: INFO nova.compute.manager [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2049.016752] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2049.017043] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2049.017140] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2049.035191] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2049.035462] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.037001] nova-compute[62208]: INFO nova.compute.claims [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2049.047399] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 2049.060243] nova-compute[62208]: INFO nova.compute.manager [-] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] Took 0.04 seconds to deallocate network for instance. [ 2049.155284] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-10496d88-3d9e-446a-924e-fd4242b5d436 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.185s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.156507] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 249.749s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.156807] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ca1b4fca-a4bb-4a37-8e88-45e103a3579f] During sync_power_state the instance has a pending task (deleting). Skip. [ 2049.157099] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ca1b4fca-a4bb-4a37-8e88-45e103a3579f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.218073] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2395277-0861-4525-b21c-2ddf696a5339 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.220565] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2049.220565] nova-compute[62208]: warnings.warn( [ 2049.226294] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff68e510-577c-40f7-b9e6-6076beec4c53 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.229154] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2049.229154] nova-compute[62208]: warnings.warn( [ 2049.257116] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50df7a45-f5ae-4c20-817b-9a1e73221e9f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.259549] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2049.259549] nova-compute[62208]: warnings.warn( [ 2049.264858] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cae7dc51-f791-4e10-a4bb-4923d62f2c93 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.268599] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2049.268599] nova-compute[62208]: warnings.warn( [ 2049.278266] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2049.286585] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2049.307052] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.307544] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Start building networks asynchronously for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2049.343582] nova-compute[62208]: DEBUG nova.compute.utils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2049.345397] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2049.345662] nova-compute[62208]: DEBUG nova.network.neutron [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2049.358617] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2049.390242] nova-compute[62208]: DEBUG nova.policy [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8cb00a6413b46fcb17cbe532a0bffc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '53b578fa6aa34a2d80eb9938d58ffe12', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2049.426708] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2049.450419] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2049.450700] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2049.450861] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2049.451064] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2049.451344] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2049.451542] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2049.451782] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2049.451949] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 2049.452160] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2049.452337] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2049.452584] nova-compute[62208]: DEBUG nova.virt.hardware [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2049.453581] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-199d5ea6-0c9e-4f7c-923a-94fc55c015e4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.456322] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2049.456322] nova-compute[62208]: warnings.warn( [ 2049.462282] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd34b6d5-e403-4c27-930b-27cb43855835 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.466486] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2049.466486] nova-compute[62208]: warnings.warn( [ 2049.630124] nova-compute[62208]: DEBUG nova.network.neutron [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Successfully created port: c2798cef-b133-459d-a9be-c6783db5ab38 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2050.275713] nova-compute[62208]: DEBUG nova.network.neutron [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Successfully updated port: c2798cef-b133-459d-a9be-c6783db5ab38 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2050.295413] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "refresh_cache-5b63cd2f-0b14-4008-b564-0078d3e0e20a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2050.295413] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "refresh_cache-5b63cd2f-0b14-4008-b564-0078d3e0e20a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2050.295413] nova-compute[62208]: DEBUG nova.network.neutron [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2050.336860] nova-compute[62208]: DEBUG nova.network.neutron [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2050.375739] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "5b797610-f460-461c-8c5a-1a28cf162c0e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2050.564532] nova-compute[62208]: DEBUG nova.network.neutron [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Updating instance_info_cache with network_info: [{"id": "c2798cef-b133-459d-a9be-c6783db5ab38", "address": "fa:16:3e:4d:72:ee", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc2798cef-b1", "ovs_interfaceid": "c2798cef-b133-459d-a9be-c6783db5ab38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2050.578414] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "refresh_cache-5b63cd2f-0b14-4008-b564-0078d3e0e20a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2050.578728] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Instance network_info: |[{"id": "c2798cef-b133-459d-a9be-c6783db5ab38", "address": "fa:16:3e:4d:72:ee", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", 
"external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc2798cef-b1", "ovs_interfaceid": "c2798cef-b133-459d-a9be-c6783db5ab38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 2050.579434] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4d:72:ee', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b5f9472-1844-4c99-8804-8f193cfff562', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c2798cef-b133-459d-a9be-c6783db5ab38', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2050.586833] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2050.587280] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2050.587516] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b2d63741-78fd-43c1-995c-ee635b31e067 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2050.603127] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2050.603127] nova-compute[62208]: warnings.warn( [ 2050.609363] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2050.609363] nova-compute[62208]: value = "task-38665" [ 2050.609363] nova-compute[62208]: _type = "Task" [ 2050.609363] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2050.612658] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2050.612658] nova-compute[62208]: warnings.warn( [ 2050.618167] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38665, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2050.872686] nova-compute[62208]: DEBUG nova.compute.manager [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Received event network-vif-plugged-c2798cef-b133-459d-a9be-c6783db5ab38 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2050.872986] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] Acquiring lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2050.873324] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2050.873587] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2050.873821] nova-compute[62208]: DEBUG nova.compute.manager [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] No waiting events found dispatching network-vif-plugged-c2798cef-b133-459d-a9be-c6783db5ab38 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2050.874016] nova-compute[62208]: WARNING nova.compute.manager [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Received unexpected event network-vif-plugged-c2798cef-b133-459d-a9be-c6783db5ab38 for instance with vm_state building and task_state spawning. [ 2050.874226] nova-compute[62208]: DEBUG nova.compute.manager [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Received event network-changed-c2798cef-b133-459d-a9be-c6783db5ab38 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2050.874434] nova-compute[62208]: DEBUG nova.compute.manager [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Refreshing instance network info cache due to event network-changed-c2798cef-b133-459d-a9be-c6783db5ab38. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2050.874645] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] Acquiring lock "refresh_cache-5b63cd2f-0b14-4008-b564-0078d3e0e20a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2050.874826] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] Acquired lock "refresh_cache-5b63cd2f-0b14-4008-b564-0078d3e0e20a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2050.875039] nova-compute[62208]: DEBUG nova.network.neutron [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Refreshing network info cache for port c2798cef-b133-459d-a9be-c6783db5ab38 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2051.113708] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2051.113708] nova-compute[62208]: warnings.warn( [ 2051.119778] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38665, 'name': CreateVM_Task, 'duration_secs': 0.303206} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2051.120107] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2051.120850] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2051.121091] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2051.123995] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bb8cd91-cfde-4eeb-a9d9-83b0e65bc890 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.135060] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2051.135060] nova-compute[62208]: warnings.warn( [ 2051.158163] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Reconfiguring VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2051.158495] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-70002233-7d94-4166-9e5e-d47fef1a0999 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.168793] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2051.168793] nova-compute[62208]: warnings.warn( [ 2051.175023] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2051.175023] nova-compute[62208]: value = "task-38666" [ 2051.175023] nova-compute[62208]: _type = "Task" [ 2051.175023] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.178042] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2051.178042] nova-compute[62208]: warnings.warn( [ 2051.183771] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38666, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2051.323900] nova-compute[62208]: DEBUG nova.network.neutron [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Updated VIF entry in instance network info cache for port c2798cef-b133-459d-a9be-c6783db5ab38. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2051.324278] nova-compute[62208]: DEBUG nova.network.neutron [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Updating instance_info_cache with network_info: [{"id": "c2798cef-b133-459d-a9be-c6783db5ab38", "address": "fa:16:3e:4d:72:ee", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc2798cef-b1", "ovs_interfaceid": "c2798cef-b133-459d-a9be-c6783db5ab38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2051.334819] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-4d693e4c-480c-4c00-9a71-6f6faf3b9bcb req-48aec8e7-96ba-4a0b-afbb-e4b490c3cd86 service nova] Releasing lock "refresh_cache-5b63cd2f-0b14-4008-b564-0078d3e0e20a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2051.679560] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2051.679560] nova-compute[62208]: warnings.warn( [ 2051.685368] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38666, 'name': ReconfigVM_Task, 'duration_secs': 0.113277} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2051.685667] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Reconfigured VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2051.685843] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.565s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2051.686098] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2051.686252] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2051.686581] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2051.686818] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f2fddd0-7c04-41ec-9642-c476f1e564b4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.688861] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2051.688861] nova-compute[62208]: warnings.warn( [ 2051.692017] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2051.692017] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e10cf7-5c3e-e74f-b66a-688ea8a0e019" [ 2051.692017] nova-compute[62208]: _type = "Task" [ 2051.692017] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.697547] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2051.697547] nova-compute[62208]: warnings.warn( [ 2051.702533] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e10cf7-5c3e-e74f-b66a-688ea8a0e019, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2052.196582] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2052.196582] nova-compute[62208]: warnings.warn( [ 2052.203083] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2052.203405] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2052.203620] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2056.141506] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2089.147944] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2091.141301] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2095.141316] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2095.141599] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2095.141599] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2095.162326] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.162503] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.162633] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.162727] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.162853] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.162976] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.163096] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.163216] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.163337] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.163458] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2095.163586] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2095.164087] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2096.141081] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2096.141326] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2096.151758] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2096.151997] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2096.152189] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2096.152348] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2096.153496] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff576ac7-f261-45fa-9b68-527d66c7f563 {{(pid=62208) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.156651] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2096.156651] nova-compute[62208]: warnings.warn( [ 2096.162964] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39afb2b2-0af4-409e-b571-df5bc593eb48 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.167549] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2096.167549] nova-compute[62208]: warnings.warn( [ 2096.179247] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7370c3ce-39ff-46d2-89a8-1fa97ae31775 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.181591] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2096.181591] nova-compute[62208]: warnings.warn( [ 2096.186624] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-847ef0e1-23f2-4727-90a1-9726bdf3ddc4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.189851] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2096.189851] nova-compute[62208]: warnings.warn( [ 2096.217735] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181964MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2096.217901] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2096.218103] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2096.290479] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.290640] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.290766] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.290889] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.291010] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.291129] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.291250] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b797610-f460-461c-8c5a-1a28cf162c0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.291366] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.291481] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec568c91-b110-4c2a-8d62-8127c7781d03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.291594] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2096.291790] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2096.291925] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2096.427919] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8821f6d6-cfe7-4069-b7d1-f1be08a85e06 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.430825] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2096.430825] nova-compute[62208]: warnings.warn( [ 2096.435995] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-751300c4-4c88-4215-a48e-f1b870de43a6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.438915] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2096.438915] nova-compute[62208]: warnings.warn( [ 2096.465643] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0233408-1e9e-427a-af65-b2e91b5e8266 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.468082] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2096.468082] nova-compute[62208]: warnings.warn( [ 2096.473243] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a148b23d-915f-47e7-8a13-c6f3575189b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2096.477000] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2096.477000] nova-compute[62208]: warnings.warn( [ 2096.487110] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2096.495134] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2096.512817] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2096.513009] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.295s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2097.122498] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2097.122498] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2097.123398] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 
tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2097.125091] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2097.125374] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Copying Virtual Disk [datastore2] vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/2d6718b4-1cb2-41a1-a7f5-2ca248b51680/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2097.125687] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-61d6f9af-5f6c-4e69-b588-0ac2f79ead0a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.127938] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2097.127938] nova-compute[62208]: warnings.warn( [ 2097.133618] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2097.133618] nova-compute[62208]: value = "task-38667" [ 2097.133618] nova-compute[62208]: _type = "Task" [ 2097.133618] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2097.136981] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2097.136981] nova-compute[62208]: warnings.warn( [ 2097.142345] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38667, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2097.513393] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.637601] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2097.637601] nova-compute[62208]: warnings.warn( [ 2097.644091] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2097.644408] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2097.644967] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Traceback (most recent call last): [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] yield resources [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] self.driver.spawn(context, instance, image_meta, [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] self._fetch_image_if_missing(context, vi) [ 
2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] image_cache(vi, tmp_image_ds_loc) [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] vm_util.copy_virtual_disk( [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] session._wait_for_task(vmdk_copy_task) [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] return self.wait_for_task(task_ref) [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] return evt.wait() [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] result = hub.switch() [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] return self.greenlet.switch() [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] self.f(*self.args, **self.kw) [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] raise exceptions.translate_fault(task_info.error) [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2097.644967] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Faults: ['InvalidArgument'] [ 2097.644967] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] [ 2097.646595] nova-compute[62208]: INFO nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Terminating instance [ 2097.646859] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2097.647066] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2097.647311] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c8dd09a5-1c9a-4b0b-989e-71d7c40dad7c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.649958] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2097.650192] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2097.651032] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f7326ba-536a-418c-97c5-903df27d10da {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.653693] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2097.653693] nova-compute[62208]: warnings.warn( [ 2097.654087] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2097.654087] nova-compute[62208]: warnings.warn( [ 2097.658728] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2097.659979] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-98cba426-3c2e-48b7-a2f5-e777ccab455f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.661419] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2097.661584] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2097.662246] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9c03375a-45ee-4dbe-8e6f-2e5666911b01 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.664084] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2097.664084] nova-compute[62208]: warnings.warn( [ 2097.664381] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2097.664381] nova-compute[62208]: warnings.warn( [ 2097.667146] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 2097.667146] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52950eb2-de30-02dc-0fd9-c0202df2c65b" [ 2097.667146] nova-compute[62208]: _type = "Task" [ 2097.667146] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2097.671234] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2097.671234] nova-compute[62208]: warnings.warn( [ 2097.675963] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52950eb2-de30-02dc-0fd9-c0202df2c65b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2098.138228] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2098.138476] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2098.138667] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleting the datastore file [datastore2] 68b1024d-2bfd-4999-9ba2-f2558c223885 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2098.138943] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e46d980b-1376-4d8d-99c8-804a79bfe57f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.140997] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.140997] nova-compute[62208]: warnings.warn( [ 2098.146055] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2098.146055] nova-compute[62208]: value = "task-38669" [ 2098.146055] nova-compute[62208]: _type = "Task" [ 2098.146055] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2098.149237] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.149237] nova-compute[62208]: warnings.warn( [ 2098.154481] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38669, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2098.171089] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.171089] nova-compute[62208]: warnings.warn( [ 2098.177358] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2098.177614] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating directory with path [datastore2] vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2098.177863] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-22f372ed-eb38-4744-ade2-7bc006a6e9cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.179834] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.179834] nova-compute[62208]: warnings.warn( [ 2098.197756] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Created directory with path [datastore2] vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2098.197991] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Fetch image to [datastore2] vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2098.198230] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2098.199098] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-172ff0d6-fc66-4654-94e7-39439a287272 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.201587] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.201587] nova-compute[62208]: warnings.warn( [ 2098.206426] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3bd03a2-8a8f-49a6-8685-3823778d43b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.208692] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.208692] nova-compute[62208]: warnings.warn( [ 2098.215930] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de983ae1-228d-4291-9f71-c64bdf1ac62e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.220111] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.220111] nova-compute[62208]: warnings.warn( [ 2098.250360] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0975e407-42a0-4393-b425-e09f52a376d9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.252886] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.252886] nova-compute[62208]: warnings.warn( [ 2098.257134] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-002831af-7d68-4c93-8c51-dd5b8e3a5b92 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.258812] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.258812] nova-compute[62208]: warnings.warn( [ 2098.283203] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2098.332434] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2098.391611] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2098.391826] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2098.651555] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.651555] nova-compute[62208]: warnings.warn( [ 2098.657879] nova-compute[62208]: DEBUG oslo_vmware.api [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38669, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078408} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2098.657879] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2098.657879] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2098.657879] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2098.657879] nova-compute[62208]: INFO nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Took 1.01 seconds to destroy the instance on the hypervisor. 
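The entries above trace nova's VMware image-cache path end to end: the vmdk copy for the cached image is serialized behind a per-image lockutils lock, the resulting CopyVirtualDisk_Task is polled through wait_for_task/_poll_task, and a task error such as "A specified parameter was not correct: fileType" is translated into a VimFaultException inside _cache_sparse_image. A minimal, self-contained sketch of that lock-then-poll pattern follows; FakeTask, TaskError and poll_task are illustrative stand-ins, not nova's or oslo.vmware's actual code.

    import time

    from oslo_concurrency import lockutils


    class TaskError(Exception):
        """Illustrative stand-in for oslo_vmware.exceptions.VimFaultException."""


    class FakeTask:
        """Hypothetical task handle that fails the way CopyVirtualDisk_Task does above."""

        def __init__(self):
            self._polls = 0

        def info(self):
            self._polls += 1
            if self._polls < 3:
                return {"state": "running", "progress": 0}
            return {"state": "error",
                    "fault": "A specified parameter was not correct: fileType"}


    def poll_task(task, interval=0.5):
        # Simplified analogue of the wait_for_task/_poll_task loop seen in the log:
        # keep polling while the task runs, raise on error, return the result otherwise.
        while True:
            info = task.info()
            if info["state"] == "running":
                time.sleep(interval)
                continue
            if info["state"] == "error":
                raise TaskError(info["fault"])
            return info


    def cache_image(image_id):
        # Per-image serialization, analogous to the
        # "[datastore2] devstack-image-cache_base/<image-id>" locks in the log.
        with lockutils.lock("devstack-image-cache_base/%s" % image_id):
            return poll_task(FakeTask())


    if __name__ == "__main__":
        try:
            cache_image("77df2b34-a7d7-43a1-a59a-01f7474c0cf7")
        except TaskError as exc:
            print("copy failed, build will be rescheduled: %s" % exc)

In the real code path the failure propagates out of spawn(), the resource claim is aborted and the build is rescheduled, which is the sequence recorded from 2098.922587 onward.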
[ 2098.660066] nova-compute[62208]: DEBUG nova.compute.claims [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935799300> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2098.660254] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2098.660487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2098.833528] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f85dc0f-42eb-4893-89f7-7f9f9e1b2565 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.836499] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.836499] nova-compute[62208]: warnings.warn( [ 2098.841689] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4bafd61-f6fe-4272-8532-42b680059e09 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.844841] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.844841] nova-compute[62208]: warnings.warn( [ 2098.874133] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50339293-bb08-4df3-8a85-6fceb1c124c4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.876837] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.876837] nova-compute[62208]: warnings.warn( [ 2098.882379] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f40c8d1f-0ea6-4550-9240-9560c0bc5b6e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.886057] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2098.886057] nova-compute[62208]: warnings.warn( [ 2098.895494] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2098.904881] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2098.922120] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.261s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2098.922587] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Traceback (most recent call last): [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] self.driver.spawn(context, instance, image_meta, [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2098.922587] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] self._fetch_image_if_missing(context, vi) [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] image_cache(vi, tmp_image_ds_loc) [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] vm_util.copy_virtual_disk( [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] session._wait_for_task(vmdk_copy_task) [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] return self.wait_for_task(task_ref) [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] return evt.wait() [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] result = hub.switch() [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] return self.greenlet.switch() [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] self.f(*self.args, **self.kw) [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 
68b1024d-2bfd-4999-9ba2-f2558c223885] raise exceptions.translate_fault(task_info.error) [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Faults: ['InvalidArgument'] [ 2098.922587] nova-compute[62208]: ERROR nova.compute.manager [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] [ 2098.923458] nova-compute[62208]: DEBUG nova.compute.utils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2098.924824] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Build of instance 68b1024d-2bfd-4999-9ba2-f2558c223885 was re-scheduled: A specified parameter was not correct: fileType [ 2098.924824] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2098.925204] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2098.925385] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2098.925570] nova-compute[62208]: DEBUG nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2098.925728] nova-compute[62208]: DEBUG nova.network.neutron [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2099.135739] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2099.203546] nova-compute[62208]: DEBUG nova.network.neutron [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2099.219824] nova-compute[62208]: INFO nova.compute.manager [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Took 0.29 seconds to deallocate network for instance. 
[ 2099.322898] nova-compute[62208]: INFO nova.scheduler.client.report [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted allocations for instance 68b1024d-2bfd-4999-9ba2-f2558c223885 [ 2099.346430] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ed669cb2-6bf5-4d0a-89c4-70f084799e4e tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 538.129s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.346825] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 341.780s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.347145] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "68b1024d-2bfd-4999-9ba2-f2558c223885-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2099.347485] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.347783] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.351377] nova-compute[62208]: INFO nova.compute.manager [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Terminating instance [ 2099.354029] nova-compute[62208]: DEBUG nova.compute.manager [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2099.354354] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2099.354737] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-407e7b9a-0326-4c27-8c57-60b0767a8ac2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.357810] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2099.357810] nova-compute[62208]: warnings.warn( [ 2099.367119] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dffaaf06-8530-4e9c-8516-f3402eea9994 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.378769] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2099.378769] nova-compute[62208]: warnings.warn( [ 2099.399148] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 68b1024d-2bfd-4999-9ba2-f2558c223885 could not be found. [ 2099.399662] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2099.399855] nova-compute[62208]: INFO nova.compute.manager [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Took 0.05 seconds to destroy the instance on the hypervisor. [ 2099.400164] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2099.400436] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2099.400539] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2099.428789] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2099.437902] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] Took 0.04 seconds to deallocate network for instance. [ 2099.535601] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-480b7895-0275-4815-9d28-6f983125e91d tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.189s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.536518] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 300.129s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.536709] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 68b1024d-2bfd-4999-9ba2-f2558c223885] During sync_power_state the instance has a pending task (deleting). Skip. [ 2099.536887] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "68b1024d-2bfd-4999-9ba2-f2558c223885" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2100.140503] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2100.140687] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2107.136625] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2145.010580] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2145.010580] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2145.011210] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2145.012972] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2145.013223] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Copying Virtual Disk [datastore2] vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/c9317ddf-96a1-4ad0-857f-330dc07a747c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2145.013522] 
nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cacd0dd7-edc2-46e0-a246-83115975c74a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.015687] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.015687] nova-compute[62208]: warnings.warn( [ 2145.022969] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 2145.022969] nova-compute[62208]: value = "task-38670" [ 2145.022969] nova-compute[62208]: _type = "Task" [ 2145.022969] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2145.027505] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.027505] nova-compute[62208]: warnings.warn( [ 2145.033250] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38670, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2145.527424] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.527424] nova-compute[62208]: warnings.warn( [ 2145.533615] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2145.533905] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2145.534484] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Traceback (most recent call last): [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] yield resources [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] self.driver.spawn(context, instance, image_meta, [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] self._fetch_image_if_missing(context, vi) [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] image_cache(vi, tmp_image_ds_loc) [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] vm_util.copy_virtual_disk( [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] 
session._wait_for_task(vmdk_copy_task) [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] return self.wait_for_task(task_ref) [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] return evt.wait() [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] result = hub.switch() [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] return self.greenlet.switch() [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] self.f(*self.args, **self.kw) [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] raise exceptions.translate_fault(task_info.error) [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Faults: ['InvalidArgument'] [ 2145.534484] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] [ 2145.535409] nova-compute[62208]: INFO nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Terminating instance [ 2145.536388] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2145.536562] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 
tempest-SecurityGroupsTestJSON-1742011264-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2145.536803] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3fa00bbe-e5b6-46ab-b4b7-09cd831ad959 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.538966] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2145.539227] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2145.539897] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd5bf6ab-e749-4e43-9f87-a2e8fc9dc1ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.542373] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.542373] nova-compute[62208]: warnings.warn( [ 2145.542728] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.542728] nova-compute[62208]: warnings.warn( [ 2145.547063] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2145.548130] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ca01c5cd-b240-4687-8e04-52cb0e86f7a1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.549532] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2145.549710] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2145.550361] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-23d05b1d-37b4-413d-8d8e-bc57231787cb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.552307] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.552307] nova-compute[62208]: warnings.warn( [ 2145.552636] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.552636] nova-compute[62208]: warnings.warn( [ 2145.556197] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 2145.556197] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5292c11b-f211-10d3-a42e-1696d963db29" [ 2145.556197] nova-compute[62208]: _type = "Task" [ 2145.556197] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2145.560077] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.560077] nova-compute[62208]: warnings.warn( [ 2145.565040] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5292c11b-f211-10d3-a42e-1696d963db29, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2145.637171] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2145.637451] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2145.637695] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleting the datastore file [datastore2] c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2145.638003] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-61171013-aa5c-42f7-94ad-8afe4edfc530 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.639850] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.639850] nova-compute[62208]: warnings.warn( [ 2145.644710] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 2145.644710] nova-compute[62208]: value = "task-38672" [ 2145.644710] nova-compute[62208]: _type = "Task" [ 2145.644710] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2145.647819] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2145.647819] nova-compute[62208]: warnings.warn( [ 2145.653095] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38672, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2146.061052] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.061052] nova-compute[62208]: warnings.warn( [ 2146.066597] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2146.066859] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Creating directory with path [datastore2] vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2146.067100] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5bcd67f0-4aed-429a-ab89-24c1b2142f12 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.069012] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.069012] nova-compute[62208]: warnings.warn( [ 2146.079719] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Created directory with path [datastore2] vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2146.079981] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Fetch image to [datastore2] vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2146.080124] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2146.080950] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8b7af66-0fbc-4416-abad-3554a4d20186 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.083379] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.083379] nova-compute[62208]: warnings.warn( [ 2146.088075] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c52b3a9c-2aff-45da-9ac0-5bd44cb4bd9c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.090411] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.090411] nova-compute[62208]: warnings.warn( [ 2146.097432] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a671b85-faa1-4989-93fb-5f30bc746c78 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.101132] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.101132] nova-compute[62208]: warnings.warn( [ 2146.127482] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-539079c6-9772-4f60-8a9d-3e7783907a9a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.130032] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.130032] nova-compute[62208]: warnings.warn( [ 2146.133999] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-60ae4e83-0b10-496b-885d-6f0913ced9b6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.135629] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.135629] nova-compute[62208]: warnings.warn( [ 2146.149565] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.149565] nova-compute[62208]: warnings.warn( [ 2146.154823] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38672, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069865} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2146.156190] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2146.156385] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2146.156559] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2146.156747] nova-compute[62208]: INFO nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2146.158538] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2146.160793] nova-compute[62208]: DEBUG nova.compute.claims [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935567460> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2146.160966] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2146.161180] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2146.249915] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = 
https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2146.306391] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2146.306587] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2146.371463] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-779ec8a3-42e6-489c-b4fd-c79e8bfad6eb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.374111] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.374111] nova-compute[62208]: warnings.warn( [ 2146.379987] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f89f66db-a771-45be-9aff-28a10625419a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.382928] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.382928] nova-compute[62208]: warnings.warn( [ 2146.409754] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c32cc93-be1d-4833-b9e2-de846c687870 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.412100] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 2146.412100] nova-compute[62208]: warnings.warn(
[ 2146.417571] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc32d97c-4030-4bcc-8ad9-ec002bf52449 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2146.421261] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 2146.421261] nova-compute[62208]: warnings.warn(
[ 2146.431110] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2146.440219] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2146.457108] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.296s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2146.457580] nova-compute[62208]: Faults: ['InvalidArgument']
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Traceback (most recent call last):
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] self.driver.spawn(context, instance, image_meta,
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] self._fetch_image_if_missing(context, vi)
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] image_cache(vi, tmp_image_ds_loc)
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] vm_util.copy_virtual_disk(
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] session._wait_for_task(vmdk_copy_task)
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] return self.wait_for_task(task_ref)
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] return evt.wait()
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] result = hub.switch()
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] return self.greenlet.switch()
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] self.f(*self.args, **self.kw)
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] raise exceptions.translate_fault(task_info.error)
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Faults: ['InvalidArgument']
[ 2146.457580] nova-compute[62208]: ERROR nova.compute.manager [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b]
[ 2146.458417] nova-compute[62208]: DEBUG nova.compute.utils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2146.459702] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Build of instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b was re-scheduled: A specified parameter was not correct: fileType
[ 2146.459702] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}}
[ 2146.460114] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}}
[ 2146.460322] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2146.460509] nova-compute[62208]: DEBUG nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2146.460678] nova-compute[62208]: DEBUG nova.network.neutron [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2146.699926] nova-compute[62208]: DEBUG nova.network.neutron [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2146.738548] nova-compute[62208]: INFO nova.compute.manager [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Took 0.28 seconds to deallocate network for instance. [ 2146.838971] nova-compute[62208]: INFO nova.scheduler.client.report [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleted allocations for instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b [ 2146.856875] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7b623e03-edc8-41f3-8e5d-e96c9e2a1a33 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 550.323s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2146.857149] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 354.472s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2146.857370] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2146.857574] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock 
"c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2146.857836] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2146.859724] nova-compute[62208]: INFO nova.compute.manager [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Terminating instance [ 2146.863240] nova-compute[62208]: DEBUG nova.compute.manager [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2146.863486] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2146.863824] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-decb10bb-4e0b-401b-aa7a-754884d05b75 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.865763] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.865763] nova-compute[62208]: warnings.warn( [ 2146.873575] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5454794a-f238-4689-b218-653101d028e5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2146.884628] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2146.884628] nova-compute[62208]: warnings.warn( [ 2146.901980] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b could not be found. 
[ 2146.902212] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2146.902398] nova-compute[62208]: INFO nova.compute.manager [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2146.902651] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2146.902887] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2146.902983] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2146.930787] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2146.942971] nova-compute[62208]: INFO nova.compute.manager [-] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] Took 0.04 seconds to deallocate network for instance. [ 2147.077892] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7c0d7baf-e1ca-4808-ae40-df28a2beec42 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.221s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2147.078881] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 347.671s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2147.079433] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b] During sync_power_state the instance has a pending task (deleting). Skip. 
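The VimFaultException tracebacks in this log all terminate in oslo.vmware's task polling loop (api.py _poll_task raising exceptions.translate_fault(task_info.error) after CopyVirtualDisk_Task reports an error). The snippet below is a minimal, self-contained sketch of that polling-and-fault-translation pattern; FakeTask, FakeVimFault and wait_for_task are hypothetical stand-ins that only mirror the control flow visible in the traceback, not the real oslo.vmware or nova APIs.

import time

class FakeVimFault(Exception):
    """Stand-in for a VimFaultException-style error carrying fault names."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

class FakeTask:
    """Stand-in for a vCenter task reference; errors after a few polls."""
    def __init__(self):
        self.polls = 0

    def info(self):
        self.polls += 1
        if self.polls < 3:
            return {"state": "running", "progress": self.polls * 30}
        # Mirrors the fault reported for the disk-copy task in this log.
        return {"state": "error",
                "fault": "InvalidArgument",
                "message": "A specified parameter was not correct: fileType"}

def wait_for_task(task, interval=0.1):
    """Poll the task until it finishes; translate an error into an exception."""
    while True:
        info = task.info()
        if info["state"] == "running":
            time.sleep(interval)
            continue
        if info["state"] == "error":
            raise FakeVimFault([info["fault"]], info["message"])
        return info

if __name__ == "__main__":
    try:
        wait_for_task(FakeTask())
    except FakeVimFault as exc:
        print(f"Task failed: {exc} Faults: {exc.fault_list}")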
[ 2147.079699] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "c488788f-bf8a-45cf-97a5-0f5a7cf0ba0b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2149.141725] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2149.820324] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2153.142208] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2156.142134] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2156.142401] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2156.142441] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2156.161086] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2156.161325] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2156.161432] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2156.161568] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2156.161696] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2156.161820] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2156.162163] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2156.162313] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2156.162442] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2157.140619] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.140866] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.141033] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.151092] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2157.151377] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2157.151416] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2157.151551] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2157.152767] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afc896b6-a730-4472-9e32-85adc93397ed {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.155620] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2157.155620] nova-compute[62208]: warnings.warn( [ 2157.162767] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-146555d1-3ae6-4376-b3f6-f2154c2e462c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.166342] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2157.166342] nova-compute[62208]: warnings.warn( [ 2157.177003] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0dbd9b7-ba20-43ad-9593-9c479a587e1b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.179188] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2157.179188] nova-compute[62208]: warnings.warn( [ 2157.183577] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9487584-5a14-49c2-bb2e-17b6a2649402 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.187470] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2157.187470] nova-compute[62208]: warnings.warn( [ 2157.214528] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181961MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2157.214685] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2157.214884] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2157.273479] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2157.273646] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2157.273804] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2157.273930] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2157.274050] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b797610-f460-461c-8c5a-1a28cf162c0e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2157.274168] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2157.274378] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec568c91-b110-4c2a-8d62-8127c7781d03 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2157.274515] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2157.274701] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2157.274838] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2157.378488] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f16ffbea-a2d5-46ed-b4ef-572daa5f2c35 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.381034] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2157.381034] nova-compute[62208]: warnings.warn( [ 2157.386748] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33e89bd2-0d3a-4a34-89b5-00552c0b778d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.389705] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2157.389705] nova-compute[62208]: warnings.warn( [ 2157.417410] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64fefde2-5612-4a7e-8bf0-2c91cedcc73e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.419904] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2157.419904] nova-compute[62208]: warnings.warn( [ 2157.425071] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f228606f-e6d4-401d-9922-4c596bfdf011 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.428678] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2157.428678] nova-compute[62208]: warnings.warn( [ 2157.438155] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2157.446127] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2157.462562] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2157.462562] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2158.463702] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2160.136326] nova-compute[62208]: DEBUG oslo_service.periodic_task [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2160.141017] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2160.141188] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}}
[ 2187.208716] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "ec568c91-b110-4c2a-8d62-8127c7781d03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2191.362359] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin()
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2191.362359] nova-compute[62208]: ERROR oslo_vmware.rw_handles
[ 2191.363009] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2191.364610] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540
tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2191.364864] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Copying Virtual Disk [datastore2] vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/fce242e0-3372-4ebc-9f5f-dbe71fc14655/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2191.365160] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3fcca0d7-4c6b-4918-8894-9ad8ad6ed2aa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2191.367647] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.367647] nova-compute[62208]: warnings.warn( [ 2191.374324] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 2191.374324] nova-compute[62208]: value = "task-38673" [ 2191.374324] nova-compute[62208]: _type = "Task" [ 2191.374324] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2191.377464] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.377464] nova-compute[62208]: warnings.warn( [ 2191.382689] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38673, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2191.879764] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.879764] nova-compute[62208]: warnings.warn( [ 2191.885560] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2191.885856] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2191.886457] nova-compute[62208]: Faults: ['InvalidArgument']
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Traceback (most recent call last):
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] yield resources
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] self.driver.spawn(context, instance, image_meta,
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] self._fetch_image_if_missing(context, vi)
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] image_cache(vi, tmp_image_ds_loc)
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] vm_util.copy_virtual_disk(
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] session._wait_for_task(vmdk_copy_task)
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] return self.wait_for_task(task_ref)
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] return evt.wait()
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] result = hub.switch()
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] return self.greenlet.switch()
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] self.f(*self.args, **self.kw)
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] raise exceptions.translate_fault(task_info.error)
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Faults: ['InvalidArgument']
[ 2191.886457] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9]
[ 2191.887431] nova-compute[62208]: INFO nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Terminating instance
[ 2191.888362] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2191.888582] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215
tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2191.888827] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11689d4a-8a49-4115-987a-046f54e5a556 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2191.891048] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2191.891285] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2191.891984] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eb03d57-a89d-4a43-afc7-173da67a0fdd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2191.894430] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.894430] nova-compute[62208]: warnings.warn( [ 2191.894761] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.894761] nova-compute[62208]: warnings.warn( [ 2191.898953] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2191.899185] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e9853e28-6e9c-414e-bfab-de6e53267975 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2191.901410] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2191.901583] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2191.902194] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.902194] nova-compute[62208]: warnings.warn( [ 2191.902658] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f82cb33c-bd41-4d4c-b456-aea99de70a64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2191.904635] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.904635] nova-compute[62208]: warnings.warn( [ 2191.907358] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2191.907358] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52226a48-754b-1a32-6dfa-7c83c18c01d7" [ 2191.907358] nova-compute[62208]: _type = "Task" [ 2191.907358] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2191.911777] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.911777] nova-compute[62208]: warnings.warn( [ 2191.917987] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52226a48-754b-1a32-6dfa-7c83c18c01d7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2191.979090] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2191.979352] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2191.979548] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Deleting the datastore file [datastore2] 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2191.979826] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e0f27d52-1839-45c5-a222-23341ce66345 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2191.981655] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.981655] nova-compute[62208]: warnings.warn( [ 2191.987566] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for the task: (returnval){ [ 2191.987566] nova-compute[62208]: value = "task-38675" [ 2191.987566] nova-compute[62208]: _type = "Task" [ 2191.987566] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2191.990674] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2191.990674] nova-compute[62208]: warnings.warn( [ 2191.995636] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38675, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2192.412539] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.412539] nova-compute[62208]: warnings.warn( [ 2192.418709] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2192.418975] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2192.419237] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bc18edd1-fb50-420a-b11a-e90e12510006 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.421230] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.421230] nova-compute[62208]: warnings.warn( [ 2192.430948] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2192.431141] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Fetch image to [datastore2] vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2192.431315] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2192.432080] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75b00427-2961-417f-a90a-55d75dbb30c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.434363] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.434363] nova-compute[62208]: warnings.warn( [ 2192.438578] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c45758-3477-44ba-b723-7348b4c609f9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.440731] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.440731] nova-compute[62208]: warnings.warn( [ 2192.447394] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d76f47ca-3676-4b68-b4de-e056d6a9652f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.450900] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.450900] nova-compute[62208]: warnings.warn( [ 2192.477312] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59ea6e79-aa5b-4b77-aa57-3b6c75d72bfa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.479927] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.479927] nova-compute[62208]: warnings.warn( [ 2192.483898] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7cd1ff07-78bf-4e6f-b42d-369f42928496 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.485474] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.485474] nova-compute[62208]: warnings.warn( [ 2192.490703] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.490703] nova-compute[62208]: warnings.warn( [ 2192.495851] nova-compute[62208]: DEBUG oslo_vmware.api [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Task: {'id': task-38675, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075317} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2192.496114] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2192.496295] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2192.496470] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2192.496645] nova-compute[62208]: INFO nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Took 0.61 seconds to destroy the instance on the hypervisor. [ 2192.498823] nova-compute[62208]: DEBUG nova.compute.claims [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935632440> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2192.499001] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2192.499244] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2192.503266] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2192.553647] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = 
https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2192.610548] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2192.610803] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2192.695680] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b41d07e-ef87-43e7-87f7-06c09895f063 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.698168] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.698168] nova-compute[62208]: warnings.warn( [ 2192.703317] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e7c9ad7-7b34-479d-b4dc-896d449774e9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.706783] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.706783] nova-compute[62208]: warnings.warn( [ 2192.739285] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72b0bcb1-dde6-4a14-a292-858cc99a3341 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.741826] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.741826] nova-compute[62208]: warnings.warn( [ 2192.747183] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee60e25c-2b38-4d84-894f-4a69ac4d5e94 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.751033] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2192.751033] nova-compute[62208]: warnings.warn( [ 2192.761271] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2192.770362] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2192.787500] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.288s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2192.788380] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Traceback (most recent call last): [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] self.driver.spawn(context, instance, image_meta, [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn 
[ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] self._fetch_image_if_missing(context, vi) [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] image_cache(vi, tmp_image_ds_loc) [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] vm_util.copy_virtual_disk( [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] session._wait_for_task(vmdk_copy_task) [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] return self.wait_for_task(task_ref) [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] return evt.wait() [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] result = hub.switch() [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] return self.greenlet.switch() [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] self.f(*self.args, **self.kw) [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2192.788380] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] raise exceptions.translate_fault(task_info.error) [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Faults: ['InvalidArgument'] [ 2192.788380] nova-compute[62208]: ERROR nova.compute.manager [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] [ 2192.789430] nova-compute[62208]: DEBUG nova.compute.utils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2192.790298] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Build of instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 was re-scheduled: A specified parameter was not correct: fileType [ 2192.790298] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2192.790704] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2192.790882] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2192.791055] nova-compute[62208]: DEBUG nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2192.791219] nova-compute[62208]: DEBUG nova.network.neutron [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2193.078454] nova-compute[62208]: DEBUG nova.network.neutron [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2193.094317] nova-compute[62208]: INFO nova.compute.manager [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Took 0.30 seconds to deallocate network for instance. [ 2193.188443] nova-compute[62208]: INFO nova.scheduler.client.report [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Deleted allocations for instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 [ 2193.208461] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a0ad1f9a-2301-4990-ae21-654f6d1df540 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 585.777s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.209326] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 393.801s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2193.209677] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] During sync_power_state the instance has a pending task (spawning). Skip. 
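A few records earlier, nova.scheduler.client.report logs the unchanged inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 (VCPU total 48 at allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, DISK_GB total 400). As a minimal illustration only (not Nova or Placement code), the schedulable capacity implied by those figures is (total - reserved) * allocation_ratio:

    # Illustrative only: capacity implied by the inventory dict logged above.
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for resource, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{resource}: {capacity:g}")
    # VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400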
[ 2193.209983] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.210591] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 390.159s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2193.211368] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Acquiring lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2193.211731] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2193.212448] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.215202] nova-compute[62208]: INFO nova.compute.manager [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Terminating instance [ 2193.217736] nova-compute[62208]: DEBUG nova.compute.manager [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2193.218061] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2193.218461] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ef387ca1-5800-4b13-869c-6ef20e7d9063 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.221074] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2193.221074] nova-compute[62208]: warnings.warn( [ 2193.229094] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8cc4b13-dc0f-45b5-8f43-43e6ec645677 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.239774] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2193.239774] nova-compute[62208]: warnings.warn( [ 2193.257344] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9 could not be found. [ 2193.257540] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2193.257723] nova-compute[62208]: INFO nova.compute.manager [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2193.257978] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2193.258286] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2193.258387] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2193.283226] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2193.291818] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 70e11e84-4f41-4a7d-a53e-ed0b949a9fe9] Took 0.03 seconds to deallocate network for instance. [ 2193.380458] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-67de301c-4471-4723-af48-e7273a8741f4 tempest-SecurityGroupsTestJSON-1742011264 tempest-SecurityGroupsTestJSON-1742011264-project-member] Lock "70e11e84-4f41-4a7d-a53e-ed0b949a9fe9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.170s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2198.454259] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2205.152884] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "91c64da9-f295-4e84-a8bd-149a72a239da" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2205.153219] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "91c64da9-f295-4e84-a8bd-149a72a239da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2205.165206] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2205.223310] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2205.223310] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2205.225199] nova-compute[62208]: INFO nova.compute.claims [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2205.383075] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fe49ae1-3ca8-400a-820d-56732c964055 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.385701] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2205.385701] nova-compute[62208]: warnings.warn( [ 2205.390846] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c9e7289-2908-449f-9cef-cdecfc2d9f50 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.395608] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2205.395608] nova-compute[62208]: warnings.warn( [ 2205.423139] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98df254a-6f0c-4660-a0e2-5c35570df046 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.425586] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2205.425586] nova-compute[62208]: warnings.warn( [ 2205.431105] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6ecc6fc-612f-4b65-8770-ce074ddb2242 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.435872] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2205.435872] nova-compute[62208]: warnings.warn( [ 2205.445432] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2205.456030] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2205.472878] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2205.473385] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2205.510171] nova-compute[62208]: DEBUG nova.compute.utils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2205.511953] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2205.512031] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2205.525647] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2205.554475] nova-compute[62208]: DEBUG nova.policy [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ecd57f39a98a4771a6c58a6120d387e8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32db76789e284dffa4b1b13768550e15', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2205.596036] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2205.620544] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2205.620831] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2205.620994] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2205.621180] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2205.621365] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2205.621546] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2205.621782] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2205.621948] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 
tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2205.622122] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2205.622287] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2205.622463] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2205.623355] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-039e9772-c70c-4706-97e5-6ea5123ff738 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.626014] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2205.626014] nova-compute[62208]: warnings.warn( [ 2205.632271] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f4111b-0288-4352-a3de-be5208b7f914 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.636091] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2205.636091] nova-compute[62208]: warnings.warn( [ 2205.824673] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Successfully created port: 49e510d7-6108-401c-a08e-2fc03655e226 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2206.317934] nova-compute[62208]: DEBUG nova.compute.manager [req-75b78ec7-286a-400c-b005-3306ffd8bc19 req-88fee8c9-7fbc-4153-9e6e-3c748f3b4e7a service nova] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Received event network-vif-plugged-49e510d7-6108-401c-a08e-2fc03655e226 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2206.318234] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-75b78ec7-286a-400c-b005-3306ffd8bc19 req-88fee8c9-7fbc-4153-9e6e-3c748f3b4e7a service nova] Acquiring lock "91c64da9-f295-4e84-a8bd-149a72a239da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2206.318437] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-75b78ec7-286a-400c-b005-3306ffd8bc19 req-88fee8c9-7fbc-4153-9e6e-3c748f3b4e7a service nova] Lock "91c64da9-f295-4e84-a8bd-149a72a239da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2206.318612] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-75b78ec7-286a-400c-b005-3306ffd8bc19 req-88fee8c9-7fbc-4153-9e6e-3c748f3b4e7a service nova] Lock "91c64da9-f295-4e84-a8bd-149a72a239da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2206.318775] nova-compute[62208]: DEBUG nova.compute.manager [req-75b78ec7-286a-400c-b005-3306ffd8bc19 req-88fee8c9-7fbc-4153-9e6e-3c748f3b4e7a service nova] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] No waiting events found dispatching network-vif-plugged-49e510d7-6108-401c-a08e-2fc03655e226 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2206.318934] nova-compute[62208]: WARNING nova.compute.manager [req-75b78ec7-286a-400c-b005-3306ffd8bc19 req-88fee8c9-7fbc-4153-9e6e-3c748f3b4e7a service nova] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Received unexpected event network-vif-plugged-49e510d7-6108-401c-a08e-2fc03655e226 for instance with vm_state building and task_state spawning. 
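The nova.virt.hardware records above go from flavor/image limits of 0:0:0 to a single possible topology of 1:1:1 for the one-vCPU m1.nano flavor. A rough sketch of that enumeration step, assuming only the constraint visible in the log (sockets * cores * threads must cover the vCPU count, each bounded by 65536); this is not the actual Nova implementation:

    # Rough illustration of the topology search logged by nova.virt.hardware.
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        topologies = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            per_socket = vcpus // sockets
            for cores in range(1, min(per_socket, max_cores) + 1):
                if per_socket % cores:
                    continue
                threads = per_socket // cores
                if threads <= max_threads:
                    topologies.append((sockets, cores, threads))
        return topologies

    print(possible_topologies(1))  # [(1, 1, 1)] -- matches the single topology in the log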
[ 2206.427730] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Successfully updated port: 49e510d7-6108-401c-a08e-2fc03655e226 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2206.441868] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "refresh_cache-91c64da9-f295-4e84-a8bd-149a72a239da" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2206.442205] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquired lock "refresh_cache-91c64da9-f295-4e84-a8bd-149a72a239da" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2206.442205] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2206.487025] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2206.686980] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Updating instance_info_cache with network_info: [{"id": "49e510d7-6108-401c-a08e-2fc03655e226", "address": "fa:16:3e:78:46:ba", "network": {"id": "dbab563d-60c4-40bc-bce1-206a3a45dd33", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1634205887-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "32db76789e284dffa4b1b13768550e15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap49e510d7-61", "ovs_interfaceid": "49e510d7-6108-401c-a08e-2fc03655e226", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2206.703208] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Releasing lock "refresh_cache-91c64da9-f295-4e84-a8bd-149a72a239da" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2206.703719] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Instance network_info: |[{"id": "49e510d7-6108-401c-a08e-2fc03655e226", "address": "fa:16:3e:78:46:ba", "network": {"id": "dbab563d-60c4-40bc-bce1-206a3a45dd33", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1634205887-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "32db76789e284dffa4b1b13768550e15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap49e510d7-61", "ovs_interfaceid": "49e510d7-6108-401c-a08e-2fc03655e226", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1987}} [ 2206.704588] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:46:ba', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3f4a795c-8718-4a7c-aafe-9da231df10f8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '49e510d7-6108-401c-a08e-2fc03655e226', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2206.713669] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Creating folder: Project (32db76789e284dffa4b1b13768550e15). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2206.714262] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ce4e7b45-9499-495b-b279-9bcdef922a73 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2206.716525] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2206.716525] nova-compute[62208]: warnings.warn( [ 2206.727378] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Created folder: Project (32db76789e284dffa4b1b13768550e15) in parent group-v17427. [ 2206.727608] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Creating folder: Instances. Parent ref: group-v17562. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2206.728143] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-819cfaff-e6d9-45d3-ad6f-9ffe548132c3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2206.729733] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2206.729733] nova-compute[62208]: warnings.warn( [ 2206.737560] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Created folder: Instances in parent group-v17562. 
[ 2206.737787] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2206.737973] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2206.738236] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cab302dc-e1b1-4488-8aaa-0b47f6612701 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2206.752194] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2206.752194] nova-compute[62208]: warnings.warn( [ 2206.760361] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2206.760361] nova-compute[62208]: value = "task-38678" [ 2206.760361] nova-compute[62208]: _type = "Task" [ 2206.760361] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2206.763323] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2206.763323] nova-compute[62208]: warnings.warn( [ 2206.768395] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38678, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2207.264773] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2207.264773] nova-compute[62208]: warnings.warn( [ 2207.271005] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38678, 'name': CreateVM_Task, 'duration_secs': 0.307182} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2207.271188] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2207.271799] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2207.272055] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2207.274864] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba9a4695-350a-4469-ade6-c9e86bd39d3f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.285206] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2207.285206] nova-compute[62208]: warnings.warn( [ 2207.308388] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Reconfiguring VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2207.308802] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-993dedde-5f9a-4e9d-8d70-83d4448de0c9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.319076] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2207.319076] nova-compute[62208]: warnings.warn( [ 2207.324990] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Waiting for the task: (returnval){ [ 2207.324990] nova-compute[62208]: value = "task-38679" [ 2207.324990] nova-compute[62208]: _type = "Task" [ 2207.324990] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2207.328120] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2207.328120] nova-compute[62208]: warnings.warn( [ 2207.333824] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Task: {'id': task-38679, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2207.828970] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2207.828970] nova-compute[62208]: warnings.warn( [ 2207.836802] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Task: {'id': task-38679, 'name': ReconfigVM_Task, 'duration_secs': 0.109326} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2207.837172] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Reconfigured VM instance to enable vnc on port - 5904 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2207.837463] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.565s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2207.837795] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2207.837998] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2207.838451] nova-compute[62208]: DEBUG oslo_concurrency.lockutils 
[None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2207.838779] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e0d5fe37-e940-4dd1-858b-85c6d23a527b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.840968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2207.840968] nova-compute[62208]: warnings.warn( [ 2207.844767] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Waiting for the task: (returnval){ [ 2207.844767] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]528f1757-a7bd-a028-977a-4f08836e4923" [ 2207.844767] nova-compute[62208]: _type = "Task" [ 2207.844767] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2207.849063] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2207.849063] nova-compute[62208]: warnings.warn( [ 2207.855758] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]528f1757-a7bd-a028-977a-4f08836e4923, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2208.352053] nova-compute[62208]: DEBUG nova.compute.manager [req-357e3d99-4879-4dc8-8c6f-33b368e8d40e req-1a0e9211-b471-4958-b4d0-e66fd156bf90 service nova] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Received event network-changed-49e510d7-6108-401c-a08e-2fc03655e226 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2208.352320] nova-compute[62208]: DEBUG nova.compute.manager [req-357e3d99-4879-4dc8-8c6f-33b368e8d40e req-1a0e9211-b471-4958-b4d0-e66fd156bf90 service nova] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Refreshing instance network info cache due to event network-changed-49e510d7-6108-401c-a08e-2fc03655e226. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2208.352320] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-357e3d99-4879-4dc8-8c6f-33b368e8d40e req-1a0e9211-b471-4958-b4d0-e66fd156bf90 service nova] Acquiring lock "refresh_cache-91c64da9-f295-4e84-a8bd-149a72a239da" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2208.352565] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-357e3d99-4879-4dc8-8c6f-33b368e8d40e req-1a0e9211-b471-4958-b4d0-e66fd156bf90 service nova] Acquired lock "refresh_cache-91c64da9-f295-4e84-a8bd-149a72a239da" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2208.352624] nova-compute[62208]: DEBUG nova.network.neutron [req-357e3d99-4879-4dc8-8c6f-33b368e8d40e req-1a0e9211-b471-4958-b4d0-e66fd156bf90 service nova] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Refreshing network info cache for port 49e510d7-6108-401c-a08e-2fc03655e226 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2208.353565] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2208.353565] nova-compute[62208]: warnings.warn( [ 2208.359905] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2208.360136] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2208.360353] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2208.579171] nova-compute[62208]: DEBUG nova.network.neutron [req-357e3d99-4879-4dc8-8c6f-33b368e8d40e req-1a0e9211-b471-4958-b4d0-e66fd156bf90 service nova] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Updated VIF entry in instance network info cache for port 49e510d7-6108-401c-a08e-2fc03655e226. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2208.579570] nova-compute[62208]: DEBUG nova.network.neutron [req-357e3d99-4879-4dc8-8c6f-33b368e8d40e req-1a0e9211-b471-4958-b4d0-e66fd156bf90 service nova] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Updating instance_info_cache with network_info: [{"id": "49e510d7-6108-401c-a08e-2fc03655e226", "address": "fa:16:3e:78:46:ba", "network": {"id": "dbab563d-60c4-40bc-bce1-206a3a45dd33", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1634205887-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "32db76789e284dffa4b1b13768550e15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap49e510d7-61", "ovs_interfaceid": "49e510d7-6108-401c-a08e-2fc03655e226", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2208.590689] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-357e3d99-4879-4dc8-8c6f-33b368e8d40e req-1a0e9211-b471-4958-b4d0-e66fd156bf90 service nova] Releasing lock "refresh_cache-91c64da9-f295-4e84-a8bd-149a72a239da" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2209.141896] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2214.142674] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2216.142200] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2216.142575] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2216.142575] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2216.160747] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Skipping network cache update for instance 
because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2216.160913] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2216.161041] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2216.161169] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2216.161292] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2216.161414] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2216.161535] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2216.161652] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2216.161770] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2217.141531] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.143534] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.144319] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.165074] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2218.165360] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2218.165620] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2218.165620] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2218.166896] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-296e0723-7060-40e7-aab5-fbf6322625b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.169589] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2218.169589] nova-compute[62208]: warnings.warn( [ 2218.175916] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08d25413-6db3-408f-bb74-1c0864b15609 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.179582] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2218.179582] nova-compute[62208]: warnings.warn( [ 2218.190994] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59013491-2d4a-42da-9b79-ce369f67917b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.193547] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2218.193547] nova-compute[62208]: warnings.warn( [ 2218.198748] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a2d6f8-5b00-4fad-919a-17ffe7edc282 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.201911] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2218.201911] nova-compute[62208]: warnings.warn( [ 2218.227847] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181934MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2218.228025] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2218.228405] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2218.287674] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2218.287854] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2218.287986] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2218.288123] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b797610-f460-461c-8c5a-1a28cf162c0e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2218.288272] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2218.288407] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec568c91-b110-4c2a-8d62-8127c7781d03 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2218.288526] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2218.288641] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 91c64da9-f295-4e84-a8bd-149a72a239da actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2218.288826] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2218.288962] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2218.399717] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dd1127d-d595-4a96-8b96-853e77387853 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.402357] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2218.402357] nova-compute[62208]: warnings.warn( [ 2218.408833] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8bf8332-8b87-40e1-ba6e-6e4055813448 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.411877] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2218.411877] nova-compute[62208]: warnings.warn( [ 2218.438092] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5778c04-67cc-4825-a58a-c33db0bb0ba9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.440585] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2218.440585] nova-compute[62208]: warnings.warn( [ 2218.445711] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8a91d4f-462e-4253-a4b4-e544bc81ad2e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.449502] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2218.449502] nova-compute[62208]: warnings.warn( [ 2218.458967] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2218.467932] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2218.486321] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2218.486526] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2220.484642] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2220.485028] nova-compute[62208]: DEBUG oslo_service.periodic_task [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2220.485253] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2221.136704] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2232.135826] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2240.868065] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2240.868065] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2240.868065] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2240.868065] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Caching image 
{{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2240.868065] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Copying Virtual Disk [datastore2] vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/3616f08a-8bfe-4d8e-a022-9300940f45b7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2240.868065] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ed8ae9f2-4c00-4dfb-be29-5e4fdd2be746 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.871348] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2240.871348] nova-compute[62208]: warnings.warn( [ 2240.878145] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2240.878145] nova-compute[62208]: value = "task-38680" [ 2240.878145] nova-compute[62208]: _type = "Task" [ 2240.878145] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2240.882487] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2240.882487] nova-compute[62208]: warnings.warn( [ 2240.888336] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38680, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2241.384087] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.384087] nova-compute[62208]: warnings.warn( [ 2241.390011] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2241.390306] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2241.390900] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Traceback (most recent call last): [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] yield resources [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] self.driver.spawn(context, instance, image_meta, [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] self._fetch_image_if_missing(context, vi) [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] image_cache(vi, tmp_image_ds_loc) [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] vm_util.copy_virtual_disk( [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] 
session._wait_for_task(vmdk_copy_task) [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] return self.wait_for_task(task_ref) [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] return evt.wait() [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] result = hub.switch() [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] return self.greenlet.switch() [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] self.f(*self.args, **self.kw) [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] raise exceptions.translate_fault(task_info.error) [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Faults: ['InvalidArgument'] [ 2241.390900] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] [ 2241.391801] nova-compute[62208]: INFO nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Terminating instance [ 2241.394306] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2241.394502] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2241.394782] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2241.394976] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2241.395844] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b318601a-ddcc-424a-abeb-81ed10db8fc1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.398559] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-070d1015-bc5f-439f-88fd-67d53a4fa89b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.400535] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.400535] nova-compute[62208]: warnings.warn( [ 2241.400940] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.400940] nova-compute[62208]: warnings.warn( [ 2241.405980] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2241.406276] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0ec79106-7686-40fc-b18d-b06a06a7287a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.408816] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2241.408994] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2241.409650] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.409650] nova-compute[62208]: warnings.warn( [ 2241.410138] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-84f63fdc-15bb-4b08-9b3d-a1f0830a6925 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.412263] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.412263] nova-compute[62208]: warnings.warn( [ 2241.415464] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 2241.415464] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]526f45cc-cb05-98bc-5da8-a5239f7d2b73" [ 2241.415464] nova-compute[62208]: _type = "Task" [ 2241.415464] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2241.418966] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.418966] nova-compute[62208]: warnings.warn( [ 2241.423885] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]526f45cc-cb05-98bc-5da8-a5239f7d2b73, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2241.479283] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2241.479598] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2241.479827] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleting the datastore file [datastore2] 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2241.480182] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-02040d6a-7e58-48b4-a9f2-28ecf808ffb6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.482045] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.482045] nova-compute[62208]: warnings.warn( [ 2241.487438] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2241.487438] nova-compute[62208]: value = "task-38682" [ 2241.487438] nova-compute[62208]: _type = "Task" [ 2241.487438] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2241.490593] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.490593] nova-compute[62208]: warnings.warn( [ 2241.495847] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38682, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2241.919649] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.919649] nova-compute[62208]: warnings.warn( [ 2241.926269] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2241.926534] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating directory with path [datastore2] vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2241.926817] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-40cfa679-e1c6-4420-afd4-27468bfc3b1e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.928603] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.928603] nova-compute[62208]: warnings.warn( [ 2241.938651] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Created directory with path [datastore2] vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2241.938880] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Fetch image to [datastore2] vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2241.939057] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2241.939787] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43bcee5f-e93d-49ee-bb66-279b1ab064f5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.942125] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.942125] nova-compute[62208]: warnings.warn( [ 2241.946637] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e9516f-83f6-4f5e-aef6-d76e0bf45edb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.948800] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.948800] nova-compute[62208]: warnings.warn( [ 2241.955633] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1d4f408-4b6b-459d-9bbd-1c6d64e9b3cf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.959024] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.959024] nova-compute[62208]: warnings.warn( [ 2241.987095] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b36e3a1-110b-4ee3-ae88-cb1e61c34cd9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.992256] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.992256] nova-compute[62208]: warnings.warn( [ 2241.992616] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2241.992616] nova-compute[62208]: warnings.warn( [ 2241.998357] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-433bffc1-83f3-454d-9c20-5b1285c7848f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2242.000160] nova-compute[62208]: DEBUG oslo_vmware.api [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38682, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084227} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2242.000439] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2242.000634] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2242.000805] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2242.000985] nova-compute[62208]: INFO nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Took 0.61 seconds to destroy the instance on the hypervisor. 
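The SearchDatastore_Task and DeleteDatastoreFile_Task entries above all follow the same shape: wait_for_task parks the request while _poll_task re-reads the task state until it reaches a terminal state (the delete above finished in 0.084s). The snippet below is a minimal, generic sketch of that poll-until-terminal loop, not the oslo.vmware implementation; get_task_info and its state/error fields are assumptions standing in for the vSphere TaskInfo object seen in the log.

    import time

    def wait_for_task(get_task_info, poll_interval=0.5, timeout=300):
        # get_task_info() is assumed to return an object with .state
        # ('running', 'queued', 'success' or 'error'), .progress and .error,
        # roughly the TaskInfo shape visible in the log above.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info.state == "success":
                return info                    # e.g. task-38682 completing above
            if info.state == "error":
                # oslo.vmware raises a translated fault exception here instead
                raise RuntimeError(info.error)
            time.sleep(poll_interval)          # progress is 0%, keep polling
        raise TimeoutError(f"task did not complete within {timeout}s")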
[ 2242.003012] nova-compute[62208]: DEBUG nova.compute.claims [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Aborting claim: <nova.compute.claims.Claim object at 0x7fb93583cdf0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2242.003185] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2242.003443] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2242.005976] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2242.005976] nova-compute[62208]: warnings.warn( [ 2242.026187] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2242.162467] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2242.220064] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e24288dd-c068-4382-a597-24ad4262daad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2242.222625] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2242.222625] nova-compute[62208]: warnings.warn( [ 2242.225212] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2242.225418] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2242.228848] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb27dc02-e78c-473c-b01e-b090f53000aa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2242.231933] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2242.231933] nova-compute[62208]: warnings.warn( [ 2242.259128] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84ced940-b96d-44b4-b86b-a4cd35037805 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2242.261671] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2242.261671] nova-compute[62208]: warnings.warn( [ 2242.267088] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9a56225-5c72-432c-b406-bffb99db6bae {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2242.270860] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2242.270860] nova-compute[62208]: warnings.warn( [ 2242.280542] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2242.289048] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2242.305883] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.302s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2242.306467] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Traceback (most recent call last): [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] self.driver.spawn(context, instance, image_meta, [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] self._fetch_image_if_missing(context, vi) [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] image_cache(vi, tmp_image_ds_loc) [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] vm_util.copy_virtual_disk( [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] session._wait_for_task(vmdk_copy_task) [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] return self.wait_for_task(task_ref) [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] return evt.wait() [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] result = hub.switch() [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] return self.greenlet.switch() [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] self.f(*self.args, **self.kw) [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] raise exceptions.translate_fault(task_info.error) [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Faults: ['InvalidArgument'] [ 2242.306467] nova-compute[62208]: ERROR nova.compute.manager [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] [ 2242.307236] nova-compute[62208]: DEBUG nova.compute.utils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 
tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2242.308742] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Build of instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 was re-scheduled: A specified parameter was not correct: fileType [ 2242.308742] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2242.309111] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2242.309289] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2242.309474] nova-compute[62208]: DEBUG nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2242.309660] nova-compute[62208]: DEBUG nova.network.neutron [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2242.582330] nova-compute[62208]: DEBUG nova.network.neutron [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2242.600965] nova-compute[62208]: INFO nova.compute.manager [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Took 0.29 seconds to deallocate network for instance. 
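The lock acquire/release pairs in these entries ("compute_resources" around the resource tracker, the instance UUID around build/terminate, and "<uuid>-events" around event bookkeeping) come from oslo_concurrency.lockutils. A minimal sketch of the two usual forms is below; the lock names are copied from the log, the function bodies are placeholders.

    from oslo_concurrency import lockutils

    # Context-manager form, for short critical sections such as a claim abort.
    def abort_claim(tracker):
        with lockutils.lock("compute_resources"):
            tracker.abort()    # placeholder for the resource-tracker work done under the lock

    # Decorator form, serialising all operations on one instance by its UUID.
    @lockutils.synchronized("3ab2890c-e3d2-43e8-bab4-e3ba689a0529")
    def do_terminate_instance():
        pass                   # terminate path runs only while holding the instance lock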
[ 2242.715626] nova-compute[62208]: INFO nova.scheduler.client.report [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted allocations for instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 [ 2242.735346] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5112e606-af69-46ff-9e69-59dee92ef2b6 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 633.877s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2242.735601] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 443.327s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2242.735783] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] During sync_power_state the instance has a pending task (networking). Skip. [ 2242.735950] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2242.736216] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 438.121s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2242.736437] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2242.736633] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2242.736791] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock 
"3ab2890c-e3d2-43e8-bab4-e3ba689a0529-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2242.738697] nova-compute[62208]: INFO nova.compute.manager [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Terminating instance [ 2242.740447] nova-compute[62208]: DEBUG nova.compute.manager [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2242.740630] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2242.741090] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6f667a59-7d24-4024-aa9d-1f8acc7f13ab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2242.743330] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2242.743330] nova-compute[62208]: warnings.warn( [ 2242.750425] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d23ad3f6-2cf9-4809-b966-fc79b9982019 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2242.760708] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2242.760708] nova-compute[62208]: warnings.warn( [ 2242.777393] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3ab2890c-e3d2-43e8-bab4-e3ba689a0529 could not be found. 
[ 2242.777594] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2242.777754] nova-compute[62208]: INFO nova.compute.manager [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2242.777995] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2242.778226] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2242.778328] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2242.805530] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2242.814089] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 3ab2890c-e3d2-43e8-bab4-e3ba689a0529] Took 0.04 seconds to deallocate network for instance. 
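For reference, the inventory dictionary reported for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 in this section (VCPU total 48 at allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, DISK_GB total 400 with max_unit 197) gives the effective capacity that claims like the one below are checked against. A small worked calculation using only the numbers from the log, with the usual placement-style (total - reserved) * allocation_ratio formula:

    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0, "max_unit": 16},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0, "max_unit": 197},
    }

    for rc, inv in inventory.items():
        # Capacity available to claims: (total - reserved) * allocation_ratio,
        # with max_unit capping what a single instance may request.
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: capacity={capacity}, max per request={inv['max_unit']}")

    # VCPU: 192.0, MEMORY_MB: 196078.0, DISK_GB: 400.0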
[ 2242.904626] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-53e55a45-e9d1-4b8b-b0ca-16c2f9c58f39 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "3ab2890c-e3d2-43e8-bab4-e3ba689a0529" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.168s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2247.391799] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2247.391799] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2247.404596] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2247.463835] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2247.464145] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2247.465639] nova-compute[62208]: INFO nova.compute.claims [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2247.623093] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1ca62f1-efdb-4aeb-bf8f-18a42c8471e7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2247.625677] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2247.625677] nova-compute[62208]: warnings.warn( [ 2247.631597] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96bebec4-184c-4ded-8eb0-65e9dc1d522e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2247.634710] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2247.634710] nova-compute[62208]: warnings.warn( [ 2247.663698] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5f48b31-c0ff-46af-abed-e3aecde122c4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2247.666259] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2247.666259] nova-compute[62208]: warnings.warn( [ 2247.672049] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f56f7434-0900-4056-82c6-005fffce3717 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2247.677473] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2247.677473] nova-compute[62208]: warnings.warn( [ 2247.687778] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2247.697940] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2247.722868] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2247.723377] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2247.758305] nova-compute[62208]: DEBUG nova.compute.utils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2247.759944] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2247.760285] nova-compute[62208]: DEBUG nova.network.neutron [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2247.770454] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2247.808135] nova-compute[62208]: DEBUG nova.policy [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cb3f0377ac64412bf238ba3e97ecd9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4fb2ff705fe34117b2dfb9354ae8cfc8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2247.841831] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2247.865051] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2247.865526] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2247.865964] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2247.866295] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2247.866559] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 2247.866829] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2247.867198] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2247.867488] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2247.867782] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2247.868081] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2247.868410] nova-compute[62208]: DEBUG nova.virt.hardware [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2247.869360] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01944dfa-197d-432e-922a-3ec271bc16c0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2247.871988] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2247.871988] nova-compute[62208]: warnings.warn( [ 2247.878150] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e899ba35-873b-4c80-ab40-3ca8cca82aba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2247.883344] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2247.883344] nova-compute[62208]: warnings.warn( [ 2248.060485] nova-compute[62208]: DEBUG nova.network.neutron [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Successfully created port: 2668459e-5ce9-4e87-95fb-d7825dd5bc05 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2248.552280] nova-compute[62208]: DEBUG nova.compute.manager [req-fe1856d6-17cc-424a-8473-6142b11a6f6c req-e7f60bad-9d76-4213-9936-699102615d86 service nova] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Received event network-vif-plugged-2668459e-5ce9-4e87-95fb-d7825dd5bc05 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2248.552543] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-fe1856d6-17cc-424a-8473-6142b11a6f6c req-e7f60bad-9d76-4213-9936-699102615d86 service nova] Acquiring lock "75ca5bb3-c856-4548-924f-3ff3614b0f63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2248.552783] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-fe1856d6-17cc-424a-8473-6142b11a6f6c req-e7f60bad-9d76-4213-9936-699102615d86 service nova] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2248.552962] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-fe1856d6-17cc-424a-8473-6142b11a6f6c req-e7f60bad-9d76-4213-9936-699102615d86 service nova] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2248.553133] nova-compute[62208]: DEBUG nova.compute.manager [req-fe1856d6-17cc-424a-8473-6142b11a6f6c req-e7f60bad-9d76-4213-9936-699102615d86 service nova] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] No waiting events found dispatching network-vif-plugged-2668459e-5ce9-4e87-95fb-d7825dd5bc05 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2248.553298] nova-compute[62208]: WARNING nova.compute.manager [req-fe1856d6-17cc-424a-8473-6142b11a6f6c req-e7f60bad-9d76-4213-9936-699102615d86 service nova] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Received unexpected event network-vif-plugged-2668459e-5ce9-4e87-95fb-d7825dd5bc05 for instance with vm_state building and task_state spawning. 
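The three entries above show the external-event handshake for port 2668459e-5ce9-4e87-95fb-d7825dd5bc05: neutron reports network-vif-plugged, the compute manager takes the "<uuid>-events" lock, finds no registered waiter (the spawn has not started waiting yet) and records the event as unexpected. A toy sketch of that register/pop pattern follows; all class and function names are illustrative, not nova's.

    import threading

    class InstanceEvents:
        """Toy waiter registry standing in for the pop_instance_event bookkeeping."""
        def __init__(self):
            self._lock = threading.Lock()    # plays the role of the '<uuid>-events' lock
            self._waiters = {}               # event name -> threading.Event

        def prepare(self, name):
            with self._lock:
                self._waiters[name] = threading.Event()
                return self._waiters[name]

        def pop(self, name):
            with self._lock:
                return self._waiters.pop(name, None)

    events = InstanceEvents()

    def external_instance_event(name):
        waiter = events.pop(name)
        if waiter is None:
            # Same situation as the WARNING above: the event arrived before anyone waited on it.
            print("Received unexpected event", name)
        else:
            waiter.set()

    external_instance_event("network-vif-plugged-2668459e-5ce9-4e87-95fb-d7825dd5bc05")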
[ 2248.682136] nova-compute[62208]: DEBUG nova.network.neutron [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Successfully updated port: 2668459e-5ce9-4e87-95fb-d7825dd5bc05 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2248.695484] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "refresh_cache-75ca5bb3-c856-4548-924f-3ff3614b0f63" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2248.695484] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "refresh_cache-75ca5bb3-c856-4548-924f-3ff3614b0f63" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2248.695484] nova-compute[62208]: DEBUG nova.network.neutron [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2248.736814] nova-compute[62208]: DEBUG nova.network.neutron [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2248.942084] nova-compute[62208]: DEBUG nova.network.neutron [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Updating instance_info_cache with network_info: [{"id": "2668459e-5ce9-4e87-95fb-d7825dd5bc05", "address": "fa:16:3e:49:dd:46", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2668459e-5c", "ovs_interfaceid": "2668459e-5ce9-4e87-95fb-d7825dd5bc05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2248.957243] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "refresh_cache-75ca5bb3-c856-4548-924f-3ff3614b0f63" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2248.957636] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Instance network_info: |[{"id": "2668459e-5ce9-4e87-95fb-d7825dd5bc05", "address": "fa:16:3e:49:dd:46", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2668459e-5c", "ovs_interfaceid": "2668459e-5ce9-4e87-95fb-d7825dd5bc05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 
2248.958615] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:49:dd:46', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '13af9422-d668-4413-b63a-766558d83a3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2668459e-5ce9-4e87-95fb-d7825dd5bc05', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2248.966341] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2248.966821] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2248.967053] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a41ee0ba-a842-442c-942c-ff311ff5d15f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2248.982311] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2248.982311] nova-compute[62208]: warnings.warn( [ 2248.988300] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2248.988300] nova-compute[62208]: value = "task-38683" [ 2248.988300] nova-compute[62208]: _type = "Task" [ 2248.988300] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2248.991823] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2248.991823] nova-compute[62208]: warnings.warn( [ 2248.997111] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38683, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2249.493579] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2249.493579] nova-compute[62208]: warnings.warn( [ 2249.499616] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38683, 'name': CreateVM_Task, 'duration_secs': 0.287662} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2249.499900] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2249.500567] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2249.500880] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2249.503720] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-135d0132-5f4e-498b-ad2c-a1de8546cb56 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2249.513692] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2249.513692] nova-compute[62208]: warnings.warn( [ 2249.534618] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Reconfiguring VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2249.535036] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-4af09314-181c-4ecc-ad1c-18ccf69c852f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2249.545035] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2249.545035] nova-compute[62208]: warnings.warn( [ 2249.550813] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2249.550813] nova-compute[62208]: value = "task-38684" [ 2249.550813] nova-compute[62208]: _type = "Task" [ 2249.550813] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2249.553842] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2249.553842] nova-compute[62208]: warnings.warn( [ 2249.558904] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38684, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2250.055187] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2250.055187] nova-compute[62208]: warnings.warn( [ 2250.061839] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38684, 'name': ReconfigVM_Task, 'duration_secs': 0.109218} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2250.062126] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Reconfigured VM instance to enable vnc on port - 5905 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2250.062341] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.561s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2250.062597] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2250.062745] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2250.063088] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 
tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2250.063368] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f94e357-d305-4784-8522-f875e8b1107f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2250.065100] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2250.065100] nova-compute[62208]: warnings.warn( [ 2250.069588] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2250.069588] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52131860-3d1b-7790-3209-65fd899510e8" [ 2250.069588] nova-compute[62208]: _type = "Task" [ 2250.069588] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2250.072916] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2250.072916] nova-compute[62208]: warnings.warn( [ 2250.083831] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52131860-3d1b-7790-3209-65fd899510e8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2250.573803] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2250.573803] nova-compute[62208]: warnings.warn( [ 2250.578403] nova-compute[62208]: DEBUG nova.compute.manager [req-5b5ad32d-1480-4f3f-8355-b35c8dd6eb9f req-31587ae2-34f5-4b8b-8385-b93cc591fe17 service nova] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Received event network-changed-2668459e-5ce9-4e87-95fb-d7825dd5bc05 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2250.578616] nova-compute[62208]: DEBUG nova.compute.manager [req-5b5ad32d-1480-4f3f-8355-b35c8dd6eb9f req-31587ae2-34f5-4b8b-8385-b93cc591fe17 service nova] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Refreshing instance network info cache due to event network-changed-2668459e-5ce9-4e87-95fb-d7825dd5bc05. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2250.578823] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-5b5ad32d-1480-4f3f-8355-b35c8dd6eb9f req-31587ae2-34f5-4b8b-8385-b93cc591fe17 service nova] Acquiring lock "refresh_cache-75ca5bb3-c856-4548-924f-3ff3614b0f63" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2250.578958] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-5b5ad32d-1480-4f3f-8355-b35c8dd6eb9f req-31587ae2-34f5-4b8b-8385-b93cc591fe17 service nova] Acquired lock "refresh_cache-75ca5bb3-c856-4548-924f-3ff3614b0f63" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2250.579122] nova-compute[62208]: DEBUG nova.network.neutron [req-5b5ad32d-1480-4f3f-8355-b35c8dd6eb9f req-31587ae2-34f5-4b8b-8385-b93cc591fe17 service nova] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Refreshing network info cache for port 2668459e-5ce9-4e87-95fb-d7825dd5bc05 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2250.584129] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2250.584404] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2250.584624] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2250.799083] nova-compute[62208]: DEBUG nova.network.neutron [req-5b5ad32d-1480-4f3f-8355-b35c8dd6eb9f req-31587ae2-34f5-4b8b-8385-b93cc591fe17 service nova] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Updated VIF entry in instance network info cache for port 2668459e-5ce9-4e87-95fb-d7825dd5bc05. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2250.799469] nova-compute[62208]: DEBUG nova.network.neutron [req-5b5ad32d-1480-4f3f-8355-b35c8dd6eb9f req-31587ae2-34f5-4b8b-8385-b93cc591fe17 service nova] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Updating instance_info_cache with network_info: [{"id": "2668459e-5ce9-4e87-95fb-d7825dd5bc05", "address": "fa:16:3e:49:dd:46", "network": {"id": "ce724fe2-66d9-4ed9-8fe0-d32189d49488", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1765832645-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "4fb2ff705fe34117b2dfb9354ae8cfc8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13af9422-d668-4413-b63a-766558d83a3b", "external-id": "nsx-vlan-transportzone-842", "segmentation_id": 842, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2668459e-5c", "ovs_interfaceid": "2668459e-5ce9-4e87-95fb-d7825dd5bc05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2250.808628] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-5b5ad32d-1480-4f3f-8355-b35c8dd6eb9f req-31587ae2-34f5-4b8b-8385-b93cc591fe17 service nova] Releasing lock "refresh_cache-75ca5bb3-c856-4548-924f-3ff3614b0f63" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2271.140673] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2276.141862] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2277.141677] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2277.141884] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2277.142119] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2277.161849] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Skipping network cache update for instance because 
it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2277.162005] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2277.162135] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2277.162266] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2277.162392] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2277.162515] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2277.162635] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2277.162755] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2277.162875] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2278.141277] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2279.140610] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2280.141492] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2280.152334] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2280.152573] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2280.152743] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2280.152902] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2280.154050] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85b63eba-7c51-4691-8d42-7d7a2201a6b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.157039] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2280.157039] nova-compute[62208]: warnings.warn( [ 2280.163230] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00ae6c65-318a-43d3-aa8e-1cdedea12f47 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.167083] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2280.167083] nova-compute[62208]: warnings.warn( [ 2280.177143] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c35adfed-29dc-4d61-b998-108ca026fe1e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.179347] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2280.179347] nova-compute[62208]: warnings.warn( [ 2280.183498] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4cf17e8-0448-4b43-88f5-c02594ed4243 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.186363] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2280.186363] nova-compute[62208]: warnings.warn( [ 2280.215460] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181951MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2280.215680] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2280.215945] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2280.275306] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2280.275475] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2280.275611] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b797610-f460-461c-8c5a-1a28cf162c0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2280.275730] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2280.275849] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec568c91-b110-4c2a-8d62-8127c7781d03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2280.275968] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2280.276102] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 91c64da9-f295-4e84-a8bd-149a72a239da actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2280.276218] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2280.276406] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2280.276541] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2280.375860] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0570e0c5-c36c-41ad-a79c-53ebf2d728d4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.378291] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2280.378291] nova-compute[62208]: warnings.warn( [ 2280.383978] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e14afaac-76e7-45d5-a14a-ea2f8c08e2cc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.386823] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2280.386823] nova-compute[62208]: warnings.warn( [ 2280.413249] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aeb9cc56-ad97-4a93-8d73-7ff6d2558bad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.415594] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2280.415594] nova-compute[62208]: warnings.warn( [ 2280.420481] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a7294f5-78ab-40be-a27b-6a3824d07d62 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2280.423985] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2280.423985] nova-compute[62208]: warnings.warn( [ 2280.433618] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2280.441591] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2280.463083] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2280.463216] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2281.457590] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2282.140850] nova-compute[62208]: DEBUG oslo_service.periodic_task [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2282.141105] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2282.141254] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2289.778773] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2289.778773] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2289.779374] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2289.783695] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2289.783695] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] 
Copying Virtual Disk [datastore2] vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/dc6a4878-51d9-47dc-b259-7055b89111c5/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2289.783695] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7ed6dac4-b374-4c5f-9cb7-5e19b98cb2c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.784199] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2289.784199] nova-compute[62208]: warnings.warn( [ 2289.793066] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 2289.793066] nova-compute[62208]: value = "task-38685" [ 2289.793066] nova-compute[62208]: _type = "Task" [ 2289.793066] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2289.796079] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2289.796079] nova-compute[62208]: warnings.warn( [ 2289.801278] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38685, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2290.297182] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.297182] nova-compute[62208]: warnings.warn( [ 2290.303163] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2290.303452] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2290.304118] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Traceback (most recent call last): [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] yield resources [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] self.driver.spawn(context, instance, image_meta, [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] self._fetch_image_if_missing(context, vi) [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] image_cache(vi, tmp_image_ds_loc) [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] vm_util.copy_virtual_disk( [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] 
session._wait_for_task(vmdk_copy_task) [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] return self.wait_for_task(task_ref) [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] return evt.wait() [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] result = hub.switch() [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] return self.greenlet.switch() [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] self.f(*self.args, **self.kw) [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] raise exceptions.translate_fault(task_info.error) [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Faults: ['InvalidArgument'] [ 2290.304118] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] [ 2290.305187] nova-compute[62208]: INFO nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Terminating instance [ 2290.307458] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2290.307654] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2290.307947] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2290.308159] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2290.308924] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbe707dc-b095-47f8-a934-8dc9a64be4bf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.311683] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d46c290a-8bba-498a-8d2b-b8e47f568005 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.313200] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.313200] nova-compute[62208]: warnings.warn( [ 2290.313534] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.313534] nova-compute[62208]: warnings.warn( [ 2290.317754] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2290.317978] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c3fb1e1a-f8ba-4ad7-b64b-8d707a3d29ac {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.320211] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2290.320399] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2290.320961] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.320961] nova-compute[62208]: warnings.warn( [ 2290.321352] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6cd4df72-1da6-4fb4-b01c-96fdd3b85796 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.323453] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.323453] nova-compute[62208]: warnings.warn( [ 2290.326128] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 2290.326128] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524b834f-e89e-b69e-5394-79886e3952e6" [ 2290.326128] nova-compute[62208]: _type = "Task" [ 2290.326128] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2290.328821] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.328821] nova-compute[62208]: warnings.warn( [ 2290.336735] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524b834f-e89e-b69e-5394-79886e3952e6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2290.402673] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2290.402841] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2290.403022] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleting the datastore file [datastore2] e0a444fc-dca2-419a-9ac1-8d71048e1690 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2290.403299] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e6073295-5d7e-4285-8c45-d743970d5975 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.405131] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.405131] nova-compute[62208]: warnings.warn( [ 2290.409964] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for the task: (returnval){ [ 2290.409964] nova-compute[62208]: value = "task-38687" [ 2290.409964] nova-compute[62208]: _type = "Task" [ 2290.409964] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2290.413522] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.413522] nova-compute[62208]: warnings.warn( [ 2290.418301] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38687, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2290.829985] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.829985] nova-compute[62208]: warnings.warn( [ 2290.835865] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2290.836157] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Creating directory with path [datastore2] vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2290.836419] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ac87e256-6819-43c8-8884-cf877895f694 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.838084] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.838084] nova-compute[62208]: warnings.warn( [ 2290.849117] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Created directory with path [datastore2] vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2290.849311] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Fetch image to [datastore2] vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2290.849477] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2290.850243] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6b8ec8d-1142-4aba-8f1d-041107410280 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.852799] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.852799] nova-compute[62208]: warnings.warn( [ 2290.857848] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3086cd22-e3cc-476f-a6df-e64faaad28a6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.860270] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.860270] nova-compute[62208]: warnings.warn( [ 2290.867458] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd0c329d-66f0-4c1d-a8e6-3ea2db720489 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.871048] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.871048] nova-compute[62208]: warnings.warn( [ 2290.901708] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb6e56d-1e7b-44c3-affe-5392b19bed82 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.904182] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.904182] nova-compute[62208]: warnings.warn( [ 2290.908378] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-83f7cd63-1380-433e-a28d-11a25582f38d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.910183] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.910183] nova-compute[62208]: warnings.warn( [ 2290.913625] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2290.913625] nova-compute[62208]: warnings.warn( [ 2290.921233] nova-compute[62208]: DEBUG oslo_vmware.api [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Task: {'id': task-38687, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085758} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2290.921483] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2290.921659] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2290.921829] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2290.922003] nova-compute[62208]: INFO nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Took 0.61 seconds to destroy the instance on the hypervisor. [ 2290.924235] nova-compute[62208]: DEBUG nova.compute.claims [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935de0400> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2290.924427] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2290.924683] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2290.932285] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2290.986155] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = 
https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2291.053619] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2291.053881] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2291.145208] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-880f86bb-2113-4c9d-9e0b-a926cd20ac85 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2291.147777] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2291.147777] nova-compute[62208]: warnings.warn( [ 2291.153225] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffbacdd5-eac9-48b3-94fb-0b10750278a9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2291.156808] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2291.156808] nova-compute[62208]: warnings.warn( [ 2291.183359] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06f8e909-8ccf-46f4-8c4e-7f042ecdbedc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2291.185829] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2291.185829] nova-compute[62208]: warnings.warn( [ 2291.191021] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b7c16a4-859c-47a2-a595-05969c769b71 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2291.194625] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2291.194625] nova-compute[62208]: warnings.warn( [ 2291.204365] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2291.214355] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2291.230753] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.306s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2291.231351] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Traceback (most recent call last): [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] self.driver.spawn(context, instance, image_meta, [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, 
in spawn [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] self._fetch_image_if_missing(context, vi) [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] image_cache(vi, tmp_image_ds_loc) [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] vm_util.copy_virtual_disk( [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] session._wait_for_task(vmdk_copy_task) [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] return self.wait_for_task(task_ref) [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] return evt.wait() [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] result = hub.switch() [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] return self.greenlet.switch() [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] self.f(*self.args, **self.kw) [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2291.231351] nova-compute[62208]: 
ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] raise exceptions.translate_fault(task_info.error) [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Faults: ['InvalidArgument'] [ 2291.231351] nova-compute[62208]: ERROR nova.compute.manager [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] [ 2291.232296] nova-compute[62208]: DEBUG nova.compute.utils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2291.233591] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Build of instance e0a444fc-dca2-419a-9ac1-8d71048e1690 was re-scheduled: A specified parameter was not correct: fileType [ 2291.233591] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2291.233992] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2291.234174] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2291.234345] nova-compute[62208]: DEBUG nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2291.234510] nova-compute[62208]: DEBUG nova.network.neutron [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2291.500352] nova-compute[62208]: DEBUG nova.network.neutron [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2291.514194] nova-compute[62208]: INFO nova.compute.manager [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Took 0.28 seconds to deallocate network for instance. [ 2291.616153] nova-compute[62208]: INFO nova.scheduler.client.report [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Deleted allocations for instance e0a444fc-dca2-419a-9ac1-8d71048e1690 [ 2291.637417] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b98fa270-ddce-4186-a57d-f8ff965b8ae9 tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "e0a444fc-dca2-419a-9ac1-8d71048e1690" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 534.455s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2291.637972] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "e0a444fc-dca2-419a-9ac1-8d71048e1690" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 338.658s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2291.638420] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Acquiring lock "e0a444fc-dca2-419a-9ac1-8d71048e1690-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2291.638787] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock 
"e0a444fc-dca2-419a-9ac1-8d71048e1690-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2291.639109] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "e0a444fc-dca2-419a-9ac1-8d71048e1690-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2291.642986] nova-compute[62208]: INFO nova.compute.manager [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Terminating instance [ 2291.645545] nova-compute[62208]: DEBUG nova.compute.manager [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2291.645908] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2291.646302] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-56fe17e4-9f11-4528-93e5-0a431ee6e29f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2291.648385] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2291.648385] nova-compute[62208]: warnings.warn( [ 2291.655734] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0faeeba3-84e2-42f8-af29-ba2b5094db2e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2291.666320] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2291.666320] nova-compute[62208]: warnings.warn( [ 2291.683784] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e0a444fc-dca2-419a-9ac1-8d71048e1690 could not be found. 
[ 2291.684240] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2291.684626] nova-compute[62208]: INFO nova.compute.manager [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2291.685044] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2291.685428] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2291.685626] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2291.713136] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2291.723481] nova-compute[62208]: INFO nova.compute.manager [-] [instance: e0a444fc-dca2-419a-9ac1-8d71048e1690] Took 0.04 seconds to deallocate network for instance. 
[ 2291.824646] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-50660845-8d2f-4a33-8080-a745c07882df tempest-AttachVolumeNegativeTest-648215624 tempest-AttachVolumeNegativeTest-648215624-project-member] Lock "e0a444fc-dca2-419a-9ac1-8d71048e1690" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.187s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2331.141168] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2336.142127] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2337.141877] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2337.142071] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2337.142186] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2337.159239] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2337.159444] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2337.159558] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2337.159690] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2337.159816] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2337.159939] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2337.160071] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2337.160196] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2337.202025] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2337.202025] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2337.202485] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2337.204435] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2337.204670] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Copying Virtual Disk [datastore2] vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/50cab52c-26d9-4907-aad2-be14f3702fe6/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2337.206421] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-340c8194-77f3-4c9a-b45f-67c4eb813387 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.208736] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.208736] nova-compute[62208]: warnings.warn( [ 2337.215351] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 2337.215351] nova-compute[62208]: value = "task-38688" [ 2337.215351] nova-compute[62208]: _type = "Task" [ 2337.215351] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2337.218382] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.218382] nova-compute[62208]: warnings.warn( [ 2337.223599] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38688, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2337.720729] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.720729] nova-compute[62208]: warnings.warn( [ 2337.727551] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2337.727837] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2337.728431] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Traceback (most recent call last): [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] yield resources [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] self.driver.spawn(context, instance, image_meta, [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] self._fetch_image_if_missing(context, vi) [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] image_cache(vi, tmp_image_ds_loc) [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] vm_util.copy_virtual_disk( [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] session._wait_for_task(vmdk_copy_task) [ 2337.728431] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] return self.wait_for_task(task_ref) [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] return evt.wait() [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] result = hub.switch() [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] return self.greenlet.switch() [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] self.f(*self.args, **self.kw) [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] raise exceptions.translate_fault(task_info.error) [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Faults: ['InvalidArgument'] [ 2337.728431] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] [ 2337.729485] nova-compute[62208]: INFO nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Terminating instance [ 2337.730275] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2337.730495] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating directory with path [datastore2] 
devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2337.730743] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6fa6d38e-40d2-4ff3-901c-3161c09cb48e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.733009] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2337.733195] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2337.733917] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd671941-dbe6-4a98-b19a-ab3b59a01f42 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.736450] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.736450] nova-compute[62208]: warnings.warn( [ 2337.736791] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.736791] nova-compute[62208]: warnings.warn( [ 2337.741280] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2337.741521] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1b56a0ce-efec-4b17-ab7f-c019759e6a07 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.743778] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2337.743943] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2337.744552] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.744552] nova-compute[62208]: warnings.warn( [ 2337.744975] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-38d40bd6-c104-4b5c-8e5a-2369c5ac2356 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.747567] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.747567] nova-compute[62208]: warnings.warn( [ 2337.750528] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2337.750528] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52c79fbc-6fb8-307d-257a-3322e631eb39" [ 2337.750528] nova-compute[62208]: _type = "Task" [ 2337.750528] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2337.753406] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.753406] nova-compute[62208]: warnings.warn( [ 2337.758520] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52c79fbc-6fb8-307d-257a-3322e631eb39, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2337.812430] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2337.812729] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2337.812953] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Deleting the datastore file [datastore2] 911fbbcc-69d5-479f-87f1-2561fcb3dd6b {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2337.813258] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1dfcfaad-b9d3-4778-be86-178feebb30fc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.815072] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.815072] nova-compute[62208]: warnings.warn( [ 2337.820589] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for the task: (returnval){ [ 2337.820589] nova-compute[62208]: value = "task-38690" [ 2337.820589] nova-compute[62208]: _type = "Task" [ 2337.820589] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2337.823757] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2337.823757] nova-compute[62208]: warnings.warn( [ 2337.828596] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38690, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2338.255196] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.255196] nova-compute[62208]: warnings.warn( [ 2338.261462] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2338.261784] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating directory with path [datastore2] vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2338.262058] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d8ebd194-1893-43fc-869a-6ad98b74dd68 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.263709] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.263709] nova-compute[62208]: warnings.warn( [ 2338.273483] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created directory with path [datastore2] vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2338.273728] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Fetch image to [datastore2] vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2338.273941] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2338.274691] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54f21efc-6dc7-43f7-a99f-d9fea71422b6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.276982] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.276982] nova-compute[62208]: warnings.warn( [ 2338.281679] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c58195dd-1ae0-4167-9647-ca7f47e01dbd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.283850] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.283850] nova-compute[62208]: warnings.warn( [ 2338.290716] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d74265f-7059-4c1a-a41b-e7b9563333d3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.294131] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.294131] nova-compute[62208]: warnings.warn( [ 2338.324835] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a88a9a-b7de-479b-a635-bbf9e0e09b36 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.327079] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.327079] nova-compute[62208]: warnings.warn( [ 2338.327508] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.327508] nova-compute[62208]: warnings.warn( [ 2338.332551] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Task: {'id': task-38690, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074316} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2338.334087] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2338.334335] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2338.334554] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2338.334771] nova-compute[62208]: INFO nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Took 0.60 seconds to destroy the instance on the hypervisor. [ 2338.336590] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-578bf50e-9404-4d82-9be2-9af8c65f8d64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.338524] nova-compute[62208]: DEBUG nova.compute.claims [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9358141c0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2338.338743] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2338.338998] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2338.341830] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.341830] nova-compute[62208]: warnings.warn( [ 2338.361623] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2338.422899] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2338.475227] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Refreshing inventories for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2338.480630] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2338.480630] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2338.489253] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Updating ProviderTree inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2338.489508] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2338.501469] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Refreshing aggregate associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, aggregates: None {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2338.517732] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Refreshing trait associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2338.615949] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a70c7bfc-4723-4b20-ab10-61910eab5bdb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.619093] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.619093] nova-compute[62208]: warnings.warn( [ 2338.624559] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e451051-a81b-4686-87f8-275a33cf9beb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.627543] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.627543] nova-compute[62208]: warnings.warn( [ 2338.654104] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f338ce2c-0e73-414f-9aa9-efc432564339 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.656624] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.656624] nova-compute[62208]: warnings.warn( [ 2338.661886] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-951a1cef-09ce-438e-b485-f9c6c5f9df6e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.665526] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2338.665526] nova-compute[62208]: warnings.warn( [ 2338.675658] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2338.685892] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2338.701306] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.362s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2338.701827] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Traceback (most recent call last): [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] self.driver.spawn(context, instance, image_meta, [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] self._fetch_image_if_missing(context, vi) [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2338.701827] nova-compute[62208]: 
ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] image_cache(vi, tmp_image_ds_loc) [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] vm_util.copy_virtual_disk( [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] session._wait_for_task(vmdk_copy_task) [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] return self.wait_for_task(task_ref) [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] return evt.wait() [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] result = hub.switch() [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] return self.greenlet.switch() [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] self.f(*self.args, **self.kw) [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] raise exceptions.translate_fault(task_info.error) [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Faults: ['InvalidArgument'] [ 2338.701827] nova-compute[62208]: ERROR nova.compute.manager [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] [ 2338.702742] nova-compute[62208]: DEBUG nova.compute.utils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 
tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2338.703959] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Build of instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b was re-scheduled: A specified parameter was not correct: fileType [ 2338.703959] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2338.704376] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2338.704551] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2338.704720] nova-compute[62208]: DEBUG nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2338.704881] nova-compute[62208]: DEBUG nova.network.neutron [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2339.053620] nova-compute[62208]: DEBUG nova.network.neutron [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2339.069551] nova-compute[62208]: INFO nova.compute.manager [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Took 0.36 seconds to deallocate network for instance. 
[ 2339.140969] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.176584] nova-compute[62208]: INFO nova.scheduler.client.report [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Deleted allocations for instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b [ 2339.196417] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2c573800-bd69-4daf-af8f-a9ac78ce94ae tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 531.974s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2339.196676] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 335.663s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2339.196891] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Acquiring lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2339.197133] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2339.197301] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2339.199297] nova-compute[62208]: INFO nova.compute.manager [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Terminating instance [ 2339.201087] nova-compute[62208]: DEBUG nova.compute.manager [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2339.201276] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2339.201810] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c815415b-36a0-43ee-b017-2c5cf6ff3a94 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2339.204513] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2339.204513] nova-compute[62208]: warnings.warn( [ 2339.211568] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-277a36af-4ab4-4d4c-9b65-de373ad6bca2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2339.221914] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2339.221914] nova-compute[62208]: warnings.warn( [ 2339.239224] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 911fbbcc-69d5-479f-87f1-2561fcb3dd6b could not be found. [ 2339.239472] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2339.239697] nova-compute[62208]: INFO nova.compute.manager [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2339.239946] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2339.240184] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2339.240280] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2339.265796] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2339.274091] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 911fbbcc-69d5-479f-87f1-2561fcb3dd6b] Took 0.03 seconds to deallocate network for instance. [ 2339.365411] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8317800a-0822-41b4-be8a-6c4f0746d285 tempest-ImagesTestJSON-226226016 tempest-ImagesTestJSON-226226016-project-member] Lock "911fbbcc-69d5-479f-87f1-2561fcb3dd6b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.169s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2340.141327] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2340.141543] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2340.141696] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances with incomplete migration {{(pid=62208) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11257}} [ 2341.149650] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2341.159994] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2341.160230] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2341.160404] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2341.160570] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2341.161650] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90bfa52f-6e15-43af-816c-5a84259e55b1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.164368] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2341.164368] nova-compute[62208]: warnings.warn( [ 2341.170344] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-628cdd83-ad1c-45c4-94b2-5e0725483455 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.173781] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2341.173781] nova-compute[62208]: warnings.warn( [ 2341.184655] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c944be01-e9d0-48b8-9eb8-f3de4722dee9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.186749] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2341.186749] nova-compute[62208]: warnings.warn( [ 2341.190856] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09a7ef80-c98b-40c6-8760-aae364aa7ee6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.193600] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2341.193600] nova-compute[62208]: warnings.warn( [ 2341.218652] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181953MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2341.218774] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2341.218942] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2341.270026] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b797610-f460-461c-8c5a-1a28cf162c0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2341.270196] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2341.270330] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec568c91-b110-4c2a-8d62-8127c7781d03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2341.270459] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2341.270601] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 91c64da9-f295-4e84-a8bd-149a72a239da actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2341.270735] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2341.270922] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2341.271064] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2341.355438] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-189f4f6a-1bbb-4e8f-8b4a-389246dd9713 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.357897] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2341.357897] nova-compute[62208]: warnings.warn( [ 2341.363392] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03433491-da18-4811-bb62-ba0d302ff06a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.366386] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2341.366386] nova-compute[62208]: warnings.warn( [ 2341.394152] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ca28fc8-89d6-41d3-b6e9-e9ef43652c6c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.396594] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2341.396594] nova-compute[62208]: warnings.warn( [ 2341.401898] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ba82451-86b4-4e39-b90a-27c3251dbad1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2341.405570] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2341.405570] nova-compute[62208]: warnings.warn( [ 2341.414761] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2341.423033] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2341.439913] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2341.440150] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2343.427616] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.427616] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.427616] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2344.141568] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2346.905414] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "655e577b-5034-4669-8fbd-8495671dd385" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2346.905758] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "655e577b-5034-4669-8fbd-8495671dd385" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2346.918880] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2346.974843] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2346.975161] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2346.976717] nova-compute[62208]: INFO nova.compute.claims [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2347.133651] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee801a2e-d7fd-47fc-bf60-f15be84af343 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.136216] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2347.136216] nova-compute[62208]: warnings.warn( [ 2347.141468] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2347.141741] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11219}} [ 2347.143855] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd204012-f38a-4fa3-826e-a853683dbfb8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.147506] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2347.147506] nova-compute[62208]: warnings.warn( [ 2347.151766] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] There are 0 instances to clean {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11228}} [ 2347.178942] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a46418cf-dc96-4ef7-a4f5-ef19f2d1bab5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.181625] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2347.181625] nova-compute[62208]: warnings.warn( [ 2347.187061] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d866dd31-7162-41fd-854e-8070c9dc25c3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.190965] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2347.190965] nova-compute[62208]: warnings.warn( [ 2347.200852] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2347.209134] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2347.227017] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2347.227479] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2347.262396] nova-compute[62208]: DEBUG nova.compute.utils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2347.264398] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2347.264574] nova-compute[62208]: DEBUG nova.network.neutron [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2347.273989] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2347.310028] nova-compute[62208]: DEBUG nova.policy [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7534a5a8a37e4451918e35c8b93d4ad5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8eef1e68dea42cf98f03dc8db29498a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2347.338743] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2347.360811] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2347.361354] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2347.361632] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2347.361959] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2347.362456] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2347.362725] nova-compute[62208]: DEBUG nova.virt.hardware [None 
req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2347.363128] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2347.363407] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2347.363753] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2347.364219] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2347.364546] nova-compute[62208]: DEBUG nova.virt.hardware [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2347.365869] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ff959b2-a530-45ef-a68d-b07071335c63 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.369120] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2347.369120] nova-compute[62208]: warnings.warn( [ 2347.377053] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23ed7735-71f9-4eed-8ae2-1ab9bb63b65d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.381589] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2347.381589] nova-compute[62208]: warnings.warn( [ 2347.551441] nova-compute[62208]: DEBUG nova.network.neutron [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Successfully created port: 988fa07f-eb04-4e1d-843c-f2f773b9314c {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2348.258857] nova-compute[62208]: DEBUG nova.compute.manager [req-492f1427-5673-4548-bedb-317d0997983e req-9c359143-0332-46ef-9cb1-29f9feb31fe0 service nova] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Received event network-vif-plugged-988fa07f-eb04-4e1d-843c-f2f773b9314c {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2348.259117] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-492f1427-5673-4548-bedb-317d0997983e req-9c359143-0332-46ef-9cb1-29f9feb31fe0 service nova] Acquiring lock "655e577b-5034-4669-8fbd-8495671dd385-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2348.259474] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-492f1427-5673-4548-bedb-317d0997983e req-9c359143-0332-46ef-9cb1-29f9feb31fe0 service nova] Lock "655e577b-5034-4669-8fbd-8495671dd385-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2348.259556] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-492f1427-5673-4548-bedb-317d0997983e req-9c359143-0332-46ef-9cb1-29f9feb31fe0 service nova] Lock "655e577b-5034-4669-8fbd-8495671dd385-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2348.259705] nova-compute[62208]: DEBUG nova.compute.manager [req-492f1427-5673-4548-bedb-317d0997983e req-9c359143-0332-46ef-9cb1-29f9feb31fe0 service nova] [instance: 655e577b-5034-4669-8fbd-8495671dd385] No waiting events found dispatching network-vif-plugged-988fa07f-eb04-4e1d-843c-f2f773b9314c {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2348.259796] nova-compute[62208]: WARNING nova.compute.manager [req-492f1427-5673-4548-bedb-317d0997983e req-9c359143-0332-46ef-9cb1-29f9feb31fe0 service nova] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Received unexpected event network-vif-plugged-988fa07f-eb04-4e1d-843c-f2f773b9314c for instance with vm_state building and task_state spawning. 
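Editor's note on the WARNING above ("Received unexpected event network-vif-plugged-… for instance with vm_state building and task_state spawning"): the log shows Neutron's port notification arriving at the compute manager before any code path had registered a waiter for it, so `pop_instance_event` finds nothing and the event is logged and dropped. The following is a minimal, generic sketch of that register-then-dispatch pattern, not Nova's actual implementation; the names `EventRegistry`, `register`, and `dispatch` are illustrative only.

```python
import logging
import threading

logging.basicConfig(level=logging.INFO)
LOG = logging.getLogger("event-registry")


class EventRegistry:
    """Toy 'register a waiter, then dispatch external events' pattern."""

    def __init__(self):
        self._lock = threading.Lock()
        # (instance_uuid, event_name) -> threading.Event for the waiter
        self._waiters = {}

    def register(self, instance_uuid, event_name):
        """Register interest in an event before triggering the external action."""
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev

    def dispatch(self, instance_uuid, event_name):
        """Handle an external notification; warn if nobody was waiting for it."""
        with self._lock:
            ev = self._waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            # Mirrors the log line: the event arrived while nothing was waiting.
            LOG.warning("Received unexpected event %s for instance %s",
                        event_name, instance_uuid)
            return
        ev.set()


registry = EventRegistry()
# Event arrives before any waiter was registered -> warning, as in the log.
registry.dispatch("655e577b-5034-4669-8fbd-8495671dd385",
                  "network-vif-plugged-988fa07f")
```

In the trace above this is harmless: the instance is still building, so the plug event simply has no waiter yet; a later waiter (or the subsequent network-changed event) proceeds normally.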
[ 2348.325687] nova-compute[62208]: DEBUG nova.network.neutron [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Successfully updated port: 988fa07f-eb04-4e1d-843c-f2f773b9314c {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2348.338112] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "refresh_cache-655e577b-5034-4669-8fbd-8495671dd385" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2348.338445] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "refresh_cache-655e577b-5034-4669-8fbd-8495671dd385" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2348.338656] nova-compute[62208]: DEBUG nova.network.neutron [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2348.388416] nova-compute[62208]: DEBUG nova.network.neutron [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2348.568705] nova-compute[62208]: DEBUG nova.network.neutron [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Updating instance_info_cache with network_info: [{"id": "988fa07f-eb04-4e1d-843c-f2f773b9314c", "address": "fa:16:3e:8c:72:ae", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap988fa07f-eb", "ovs_interfaceid": "988fa07f-eb04-4e1d-843c-f2f773b9314c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2348.583841] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "refresh_cache-655e577b-5034-4669-8fbd-8495671dd385" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2348.584227] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Instance network_info: |[{"id": "988fa07f-eb04-4e1d-843c-f2f773b9314c", "address": "fa:16:3e:8c:72:ae", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap988fa07f-eb", "ovs_interfaceid": "988fa07f-eb04-4e1d-843c-f2f773b9314c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 2348.584668] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8c:72:ae', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'da623279-b6f6-4570-8b15-a332120b8b60', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '988fa07f-eb04-4e1d-843c-f2f773b9314c', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2348.592646] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2348.593200] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2348.593445] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-63e47d54-c30d-4dc0-ae2e-ca9512e0f7c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.608368] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2348.608368] nova-compute[62208]: warnings.warn( [ 2348.614160] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2348.614160] nova-compute[62208]: value = "task-38691" [ 2348.614160] nova-compute[62208]: _type = "Task" [ 2348.614160] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2348.617651] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2348.617651] nova-compute[62208]: warnings.warn( [ 2348.623124] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38691, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2349.120429] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2349.120429] nova-compute[62208]: warnings.warn( [ 2349.127059] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38691, 'name': CreateVM_Task, 'duration_secs': 0.312102} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2349.127229] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2349.127859] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2349.128140] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2349.130989] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95678f2e-fe46-4848-902e-028e09e8e2b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.141040] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2349.141040] nova-compute[62208]: warnings.warn( [ 2349.162560] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2349.162910] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-547f9c89-2a6f-45cc-bd5f-4189ba3bff14 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.173003] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2349.173003] nova-compute[62208]: warnings.warn( [ 2349.179786] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2349.179786] nova-compute[62208]: value = "task-38692" [ 2349.179786] nova-compute[62208]: _type = "Task" [ 2349.179786] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2349.182748] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2349.182748] nova-compute[62208]: warnings.warn( [ 2349.188016] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38692, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2349.684067] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2349.684067] nova-compute[62208]: warnings.warn( [ 2349.690156] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38692, 'name': ReconfigVM_Task, 'duration_secs': 0.107895} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2349.690445] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2349.690685] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.563s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2349.690951] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2349.691103] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2349.691443] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired external semaphore "[datastore2] 
devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2349.691711] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-05f517f5-7f20-44d3-9c4c-dbaae7464e1c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.693308] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2349.693308] nova-compute[62208]: warnings.warn( [ 2349.696523] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2349.696523] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52629579-396b-a0ae-73ea-41836461f9a6" [ 2349.696523] nova-compute[62208]: _type = "Task" [ 2349.696523] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2349.699502] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2349.699502] nova-compute[62208]: warnings.warn( [ 2349.704437] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52629579-396b-a0ae-73ea-41836461f9a6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2350.201779] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2350.201779] nova-compute[62208]: warnings.warn( [ 2350.208746] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2350.209035] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2350.209252] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2350.283895] nova-compute[62208]: DEBUG nova.compute.manager [req-2df42c90-1746-4994-8f75-30c0b3751ac3 req-bea572ea-a53c-4f80-b251-04950c403442 service nova] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Received event network-changed-988fa07f-eb04-4e1d-843c-f2f773b9314c {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2350.283895] nova-compute[62208]: DEBUG nova.compute.manager [req-2df42c90-1746-4994-8f75-30c0b3751ac3 req-bea572ea-a53c-4f80-b251-04950c403442 service nova] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Refreshing instance network info cache due to event network-changed-988fa07f-eb04-4e1d-843c-f2f773b9314c. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2350.284156] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2df42c90-1746-4994-8f75-30c0b3751ac3 req-bea572ea-a53c-4f80-b251-04950c403442 service nova] Acquiring lock "refresh_cache-655e577b-5034-4669-8fbd-8495671dd385" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2350.284314] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2df42c90-1746-4994-8f75-30c0b3751ac3 req-bea572ea-a53c-4f80-b251-04950c403442 service nova] Acquired lock "refresh_cache-655e577b-5034-4669-8fbd-8495671dd385" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2350.284497] nova-compute[62208]: DEBUG nova.network.neutron [req-2df42c90-1746-4994-8f75-30c0b3751ac3 req-bea572ea-a53c-4f80-b251-04950c403442 service nova] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Refreshing network info cache for port 988fa07f-eb04-4e1d-843c-f2f773b9314c {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2350.508056] nova-compute[62208]: DEBUG nova.network.neutron [req-2df42c90-1746-4994-8f75-30c0b3751ac3 req-bea572ea-a53c-4f80-b251-04950c403442 service nova] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Updated VIF entry in instance network info cache for port 988fa07f-eb04-4e1d-843c-f2f773b9314c. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2350.508474] nova-compute[62208]: DEBUG nova.network.neutron [req-2df42c90-1746-4994-8f75-30c0b3751ac3 req-bea572ea-a53c-4f80-b251-04950c403442 service nova] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Updating instance_info_cache with network_info: [{"id": "988fa07f-eb04-4e1d-843c-f2f773b9314c", "address": "fa:16:3e:8c:72:ae", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap988fa07f-eb", "ovs_interfaceid": "988fa07f-eb04-4e1d-843c-f2f773b9314c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2350.517984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-2df42c90-1746-4994-8f75-30c0b3751ac3 req-bea572ea-a53c-4f80-b251-04950c403442 service nova] Releasing lock "refresh_cache-655e577b-5034-4669-8fbd-8495671dd385" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2354.173217] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2368.141555] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2374.795071] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2374.795587] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Getting list of instances from cluster (obj){ [ 2374.795587] nova-compute[62208]: value = "domain-c8" [ 2374.795587] nova-compute[62208]: _type = "ClusterComputeResource" [ 2374.795587] nova-compute[62208]: } {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2374.796657] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e17743-6afa-4726-b720-52ba69d9490f {{(pid=62208) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2374.800634] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2374.800634] nova-compute[62208]: warnings.warn( [ 2374.813511] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Got total of 7 instances {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2385.910203] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2385.910203] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2385.910828] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2385.912599] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2385.912828] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/a5b0fcb2-72fc-45fd-afd6-b9fc80fe9b1a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2385.913111] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6a0ccc0f-ff4d-449d-a378-c9c9fd230aa0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2385.915582] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2385.915582] nova-compute[62208]: warnings.warn( [ 2385.922534] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2385.922534] nova-compute[62208]: value = "task-38693" [ 2385.922534] nova-compute[62208]: _type = "Task" [ 2385.922534] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2385.925978] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2385.925978] nova-compute[62208]: warnings.warn( [ 2385.931461] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38693, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2386.427206] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.427206] nova-compute[62208]: warnings.warn( [ 2386.433422] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2386.437259] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2386.437259] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Traceback (most recent call last): [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] yield resources [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] self.driver.spawn(context, instance, image_meta, [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] self._fetch_image_if_missing(context, vi) [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] image_cache(vi, tmp_image_ds_loc) [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] vm_util.copy_virtual_disk( [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] 
session._wait_for_task(vmdk_copy_task) [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] return self.wait_for_task(task_ref) [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] return evt.wait() [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] result = hub.switch() [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] return self.greenlet.switch() [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] self.f(*self.args, **self.kw) [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] raise exceptions.translate_fault(task_info.error) [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Faults: ['InvalidArgument'] [ 2386.437259] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] [ 2386.437259] nova-compute[62208]: INFO nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Terminating instance [ 2386.438402] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2386.438402] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating 
directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2386.438402] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3f0bc29d-6b5d-475a-bfdd-d487f1cd9b47 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.439166] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2386.439399] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2386.440197] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8f0377b-28b9-496f-a501-9c68b107d78f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.442793] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.442793] nova-compute[62208]: warnings.warn( [ 2386.443170] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.443170] nova-compute[62208]: warnings.warn( [ 2386.447965] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2386.448275] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-71003e3b-ee51-4fad-86af-c7c6a1101937 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.450980] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2386.451170] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2386.451755] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.451755] nova-compute[62208]: warnings.warn( [ 2386.452202] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-38e3ff6e-bf31-4d58-aacc-edd6b10b08e9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.454139] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.454139] nova-compute[62208]: warnings.warn( [ 2386.459172] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2386.459172] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52c26686-a853-354b-8e4f-4277f50c8435" [ 2386.459172] nova-compute[62208]: _type = "Task" [ 2386.459172] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2386.462555] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.462555] nova-compute[62208]: warnings.warn( [ 2386.469768] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52c26686-a853-354b-8e4f-4277f50c8435, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2386.521751] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2386.523122] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2386.523122] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleting the datastore file [datastore2] 5b797610-f460-461c-8c5a-1a28cf162c0e {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2386.523122] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cd7f4f87-efd6-4426-b709-fecf5c52c271 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.524402] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.524402] nova-compute[62208]: warnings.warn( [ 2386.530901] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2386.530901] nova-compute[62208]: value = "task-38695" [ 2386.530901] nova-compute[62208]: _type = "Task" [ 2386.530901] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2386.534290] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.534290] nova-compute[62208]: warnings.warn( [ 2386.541741] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38695, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2386.963839] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.963839] nova-compute[62208]: warnings.warn( [ 2386.973783] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2386.974043] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating directory with path [datastore2] vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2386.974284] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3679d3b1-3bb9-44b1-9682-22ed925b3af9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.976208] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.976208] nova-compute[62208]: warnings.warn( [ 2386.986468] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2386.986734] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Fetch image to [datastore2] vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2386.987155] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2386.988042] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-414248a9-d6c2-4a94-96cc-19834dfb1192 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.990644] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.990644] nova-compute[62208]: warnings.warn( [ 2386.995595] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3457c24d-2e40-4d93-84de-f87460b53136 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.998452] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2386.998452] nova-compute[62208]: warnings.warn( [ 2387.005859] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c4f2c0-5939-413e-8375-d8f27c0f2529 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.011276] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.011276] nova-compute[62208]: warnings.warn( [ 2387.052025] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7938b70d-cf52-4c6e-ace9-7ed91014fa9e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.054328] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.054328] nova-compute[62208]: warnings.warn( [ 2387.054923] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.054923] nova-compute[62208]: warnings.warn( [ 2387.061534] nova-compute[62208]: DEBUG oslo_vmware.api [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38695, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079666} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2387.064035] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2387.064244] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2387.064471] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2387.064684] nova-compute[62208]: INFO nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Took 0.63 seconds to destroy the instance on the hypervisor. [ 2387.068286] nova-compute[62208]: DEBUG nova.compute.claims [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935db6f50> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2387.068518] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2387.068941] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2387.072437] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-50cc3b74-a8f5-4061-a5b0-59ae4a0436a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.074554] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.074554] nova-compute[62208]: warnings.warn( [ 2387.102656] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2387.174146] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2387.232841] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2387.232964] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2387.303929] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c8300f8-c1f4-4b77-9248-aba512417b19 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.308615] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.308615] nova-compute[62208]: warnings.warn( [ 2387.315036] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f778e6d0-a39d-40d0-8981-2b95e0666643 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.320305] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.320305] nova-compute[62208]: warnings.warn( [ 2387.354271] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3b84f4-6eab-4f4b-9366-b46002198f58 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.358586] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.358586] nova-compute[62208]: warnings.warn( [ 2387.364483] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6187cd3-0292-40d1-a43e-eaa3b85b3ef3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.368666] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.368666] nova-compute[62208]: warnings.warn( [ 2387.379481] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2387.389462] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2387.407900] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.339s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2387.408921] nova-compute[62208]: Faults: 
['InvalidArgument'] [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Traceback (most recent call last): [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] self.driver.spawn(context, instance, image_meta, [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] self._fetch_image_if_missing(context, vi) [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] image_cache(vi, tmp_image_ds_loc) [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] vm_util.copy_virtual_disk( [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] session._wait_for_task(vmdk_copy_task) [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] return self.wait_for_task(task_ref) [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] return evt.wait() [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] result = hub.switch() [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2387.408921] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] return self.greenlet.switch() [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] self.f(*self.args, **self.kw) [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] raise exceptions.translate_fault(task_info.error) [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Faults: ['InvalidArgument'] [ 2387.408921] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] [ 2387.410622] nova-compute[62208]: DEBUG nova.compute.utils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2387.412340] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Build of instance 5b797610-f460-461c-8c5a-1a28cf162c0e was re-scheduled: A specified parameter was not correct: fileType [ 2387.412340] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2387.412888] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2387.413207] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2387.413494] nova-compute[62208]: DEBUG nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2387.413767] nova-compute[62208]: DEBUG nova.network.neutron [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2387.747307] nova-compute[62208]: DEBUG nova.network.neutron [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2387.764868] nova-compute[62208]: INFO nova.compute.manager [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Took 0.35 seconds to deallocate network for instance. [ 2387.877794] nova-compute[62208]: INFO nova.scheduler.client.report [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted allocations for instance 5b797610-f460-461c-8c5a-1a28cf162c0e [ 2387.902568] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-70559087-2c80-4792-83f4-899e5bf27ed8 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "5b797610-f460-461c-8c5a-1a28cf162c0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 533.993s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2387.902857] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "5b797610-f460-461c-8c5a-1a28cf162c0e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 337.527s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2387.903086] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "5b797610-f460-461c-8c5a-1a28cf162c0e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2387.903290] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock 
"5b797610-f460-461c-8c5a-1a28cf162c0e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2387.903452] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "5b797610-f460-461c-8c5a-1a28cf162c0e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2387.906664] nova-compute[62208]: INFO nova.compute.manager [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Terminating instance [ 2387.909569] nova-compute[62208]: DEBUG nova.compute.manager [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2387.909771] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2387.910369] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-084637f2-bebf-432b-bb54-74ecb085e901 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.913741] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.913741] nova-compute[62208]: warnings.warn( [ 2387.923211] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e87b691c-8693-4691-be7e-70e747586bb0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2387.933978] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2387.933978] nova-compute[62208]: warnings.warn( [ 2387.951839] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5b797610-f460-461c-8c5a-1a28cf162c0e could not be found. 
[ 2387.952080] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2387.952292] nova-compute[62208]: INFO nova.compute.manager [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2387.952542] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2387.952766] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2387.952855] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2387.992170] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2388.002360] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 5b797610-f460-461c-8c5a-1a28cf162c0e] Took 0.05 seconds to deallocate network for instance. 
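The tracebacks above show the build failing inside vm_util.copy_virtual_disk while nova waits on the CopyVirtualDisk task: oslo.vmware polls the task in _poll_task and re-raises the vCenter fault as VimFaultException, carrying the VMODL fault names (['InvalidArgument']) separately from the localized message ("A specified parameter was not correct: fileType"). The following is only a minimal sketch of how such a failure surfaces to a caller; the names session, vmdk_copy_task and wait_and_report are placeholders assumed here, not the actual Nova code path.

    # Minimal sketch (assumptions noted above), Python:
    from oslo_vmware import exceptions as vexc

    def wait_and_report(session, vmdk_copy_task):
        # `session` is assumed to be an already-created
        # oslo_vmware.api.VMwareAPISession; `vmdk_copy_task` is assumed to be
        # the task reference returned by the CopyVirtualDisk_Task invocation.
        try:
            return session.wait_for_task(vmdk_copy_task)
        except vexc.VimFaultException as exc:
            # fault_list holds the raw fault class names, e.g. ['InvalidArgument'];
            # str(exc) includes the localized message, e.g.
            # "A specified parameter was not correct: fileType".
            print("vCenter task failed: %s (faults: %s)" % (exc, exc.fault_list))
            raise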
[ 2388.155109] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-a20f2a07-0670-416f-a296-6c698267d81e tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "5b797610-f460-461c-8c5a-1a28cf162c0e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.252s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2390.082258] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "e5ed059f-0390-480e-bafe-17092f272131" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2390.082951] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "e5ed059f-0390-480e-bafe-17092f272131" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2390.098877] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2390.156821] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2390.157236] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2390.158833] nova-compute[62208]: INFO nova.compute.claims [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2390.313875] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-903cc67e-162c-413a-a5ef-43384c2aad33 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.316985] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2390.316985] nova-compute[62208]: warnings.warn( [ 2390.322725] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ba338fd-c9c8-411b-8da0-53f0460f7697 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.327093] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2390.327093] nova-compute[62208]: warnings.warn( [ 2390.356195] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58fceafc-f47b-4a8c-bcf5-a90ce746fb98 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.358671] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2390.358671] nova-compute[62208]: warnings.warn( [ 2390.364325] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c79eb37-c210-4115-96f9-255f5e9bd540 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.368416] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2390.368416] nova-compute[62208]: warnings.warn( [ 2390.381210] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2390.398662] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2390.420020] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2390.420520] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2390.473523] nova-compute[62208]: DEBUG nova.compute.utils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2390.474807] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2390.474971] nova-compute[62208]: DEBUG nova.network.neutron [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2390.487905] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2390.529325] nova-compute[62208]: DEBUG nova.policy [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '83e217febb9d4b3aa797fdfb68ca09e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b50fdece9ddf4b119e5f4f8ca3f4f16c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2390.579013] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2390.601580] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2390.601949] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2390.602128] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2390.602318] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2390.602542] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2390.602756] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2390.602981] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2390.603146] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f 
tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2390.603319] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2390.603483] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2390.603654] nova-compute[62208]: DEBUG nova.virt.hardware [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2390.604781] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fcd03cb-1b2e-4720-a61f-112719371ef3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.607310] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2390.607310] nova-compute[62208]: warnings.warn( [ 2390.613776] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0684a944-0f73-4adf-9db0-57208ab6d61e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.617380] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2390.617380] nova-compute[62208]: warnings.warn( [ 2390.801919] nova-compute[62208]: DEBUG nova.network.neutron [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Successfully created port: b29c06d2-31e1-4987-a3d5-904f79182785 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2391.306482] nova-compute[62208]: DEBUG nova.compute.manager [req-7911ddf6-6330-4b6a-8f81-4801d793e6c7 req-0af7c4e3-d39c-4305-acf9-84dd0a0a3faa service nova] [instance: e5ed059f-0390-480e-bafe-17092f272131] Received event network-vif-plugged-b29c06d2-31e1-4987-a3d5-904f79182785 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2391.306482] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-7911ddf6-6330-4b6a-8f81-4801d793e6c7 req-0af7c4e3-d39c-4305-acf9-84dd0a0a3faa service nova] Acquiring lock "e5ed059f-0390-480e-bafe-17092f272131-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2391.306482] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-7911ddf6-6330-4b6a-8f81-4801d793e6c7 req-0af7c4e3-d39c-4305-acf9-84dd0a0a3faa service nova] Lock "e5ed059f-0390-480e-bafe-17092f272131-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2391.306482] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-7911ddf6-6330-4b6a-8f81-4801d793e6c7 req-0af7c4e3-d39c-4305-acf9-84dd0a0a3faa service nova] Lock "e5ed059f-0390-480e-bafe-17092f272131-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2391.306482] nova-compute[62208]: DEBUG nova.compute.manager [req-7911ddf6-6330-4b6a-8f81-4801d793e6c7 req-0af7c4e3-d39c-4305-acf9-84dd0a0a3faa service nova] [instance: e5ed059f-0390-480e-bafe-17092f272131] No waiting events found dispatching network-vif-plugged-b29c06d2-31e1-4987-a3d5-904f79182785 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2391.306482] nova-compute[62208]: WARNING nova.compute.manager [req-7911ddf6-6330-4b6a-8f81-4801d793e6c7 req-0af7c4e3-d39c-4305-acf9-84dd0a0a3faa service nova] [instance: e5ed059f-0390-480e-bafe-17092f272131] Received unexpected event network-vif-plugged-b29c06d2-31e1-4987-a3d5-904f79182785 for instance with vm_state building and task_state spawning. 
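The "Received unexpected event network-vif-plugged-..." warning above occurs when Neutron notifies Nova of the port becoming active before the compute manager has registered a waiter for that event; with vm_state building and task_state spawning the event is simply logged and discarded. The snippet below is only an illustrative sketch of the shape of such an os-server-external-events notification; the field values are copied from the surrounding log, and the token/endpoint handling of the real Neutron-to-Nova request is omitted.

    # Illustrative payload sketch (values taken from the log above), Python:
    event = {
        "events": [{
            "name": "network-vif-plugged",
            "server_uuid": "e5ed059f-0390-480e-bafe-17092f272131",
            "tag": "b29c06d2-31e1-4987-a3d5-904f79182785",
            "status": "completed",
        }]
    }
    # Neutron POSTs a body like this to the compute API at
    # /v2.1/os-server-external-events; Nova then dispatches it to the compute
    # host, where the external_instance_event handler seen in the log runs.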
[ 2391.379792] nova-compute[62208]: DEBUG nova.network.neutron [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Successfully updated port: b29c06d2-31e1-4987-a3d5-904f79182785 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2391.392182] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "refresh_cache-e5ed059f-0390-480e-bafe-17092f272131" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2391.392182] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquired lock "refresh_cache-e5ed059f-0390-480e-bafe-17092f272131" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2391.392182] nova-compute[62208]: DEBUG nova.network.neutron [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2391.436314] nova-compute[62208]: DEBUG nova.network.neutron [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2391.654848] nova-compute[62208]: DEBUG nova.network.neutron [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Updating instance_info_cache with network_info: [{"id": "b29c06d2-31e1-4987-a3d5-904f79182785", "address": "fa:16:3e:3d:bb:91", "network": {"id": "cbda0e2b-397f-4dcf-8db1-578e267d61e2", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-153411038-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "b50fdece9ddf4b119e5f4f8ca3f4f16c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4df917f7-847a-4c0e-b0e3-69a52e4a1554", "external-id": "cl2-zone-457", "segmentation_id": 457, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb29c06d2-31", "ovs_interfaceid": "b29c06d2-31e1-4987-a3d5-904f79182785", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2391.671003] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Releasing lock "refresh_cache-e5ed059f-0390-480e-bafe-17092f272131" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2391.671841] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Instance network_info: |[{"id": "b29c06d2-31e1-4987-a3d5-904f79182785", "address": "fa:16:3e:3d:bb:91", "network": {"id": "cbda0e2b-397f-4dcf-8db1-578e267d61e2", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-153411038-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "b50fdece9ddf4b119e5f4f8ca3f4f16c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4df917f7-847a-4c0e-b0e3-69a52e4a1554", "external-id": "cl2-zone-457", "segmentation_id": 457, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb29c06d2-31", "ovs_interfaceid": "b29c06d2-31e1-4987-a3d5-904f79182785", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1987}} [ 2391.672856] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3d:bb:91', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4df917f7-847a-4c0e-b0e3-69a52e4a1554', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b29c06d2-31e1-4987-a3d5-904f79182785', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2391.680832] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Creating folder: Project (b50fdece9ddf4b119e5f4f8ca3f4f16c). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2391.681597] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c4c1e32b-efaf-4c0b-a923-ac7ce97611ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.683895] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2391.683895] nova-compute[62208]: warnings.warn( [ 2391.692985] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Created folder: Project (b50fdece9ddf4b119e5f4f8ca3f4f16c) in parent group-v17427. [ 2391.693458] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Creating folder: Instances. Parent ref: group-v17567. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2391.693842] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7482484d-a8f1-47f8-a4aa-60cf6210fc0e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.696095] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2391.696095] nova-compute[62208]: warnings.warn( [ 2391.704637] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Created folder: Instances in parent group-v17567. 
[ 2391.705144] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2391.705603] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e5ed059f-0390-480e-bafe-17092f272131] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2391.705970] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ed1edd09-9eeb-493d-8a8e-13b843650271 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2391.722515] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2391.722515] nova-compute[62208]: warnings.warn( [ 2391.728185] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2391.728185] nova-compute[62208]: value = "task-38698" [ 2391.728185] nova-compute[62208]: _type = "Task" [ 2391.728185] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2391.731972] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2391.731972] nova-compute[62208]: warnings.warn( [ 2391.737987] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38698, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2392.234458] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2392.234458] nova-compute[62208]: warnings.warn( [ 2392.241729] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38698, 'name': CreateVM_Task, 'duration_secs': 0.302724} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2392.241854] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e5ed059f-0390-480e-bafe-17092f272131] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2392.242459] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2392.242698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2392.245485] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0827c83-8a04-449d-9fcd-76f7ead76714 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.255401] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2392.255401] nova-compute[62208]: warnings.warn( [ 2392.275962] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Reconfiguring VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2392.276369] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-92d30a5d-d90d-469e-a1a7-b6a79fc7318b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.286174] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2392.286174] nova-compute[62208]: warnings.warn( [ 2392.291897] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Waiting for the task: (returnval){ [ 2392.291897] nova-compute[62208]: value = "task-38699" [ 2392.291897] nova-compute[62208]: _type = "Task" [ 2392.291897] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2392.296119] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2392.296119] nova-compute[62208]: warnings.warn( [ 2392.303615] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Task: {'id': task-38699, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2392.795695] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2392.795695] nova-compute[62208]: warnings.warn( [ 2392.801388] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Task: {'id': task-38699, 'name': ReconfigVM_Task, 'duration_secs': 0.101853} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2392.801664] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Reconfigured VM instance to enable vnc on port - 5906 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2392.801888] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.559s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2392.802136] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2392.802286] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2392.802646] nova-compute[62208]: DEBUG 
oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2392.802906] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2d7d90bf-1231-45d0-8364-b3deae90d7e5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2392.804399] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2392.804399] nova-compute[62208]: warnings.warn( [ 2392.807376] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Waiting for the task: (returnval){ [ 2392.807376] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e2201f-a380-ae57-d8d4-624107e7e52b" [ 2392.807376] nova-compute[62208]: _type = "Task" [ 2392.807376] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2392.810199] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2392.810199] nova-compute[62208]: warnings.warn( [ 2392.815097] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e2201f-a380-ae57-d8d4-624107e7e52b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2393.175102] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2393.311964] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2393.311964] nova-compute[62208]: warnings.warn( [ 2393.318446] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2393.318716] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2393.318961] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2393.501379] nova-compute[62208]: DEBUG nova.compute.manager [req-e07c7bdf-616b-41ea-809e-dd0d069b084c req-dd94effa-d1b1-4f4c-bd5f-ef9b0a9b88ec service nova] [instance: e5ed059f-0390-480e-bafe-17092f272131] Received event network-changed-b29c06d2-31e1-4987-a3d5-904f79182785 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2393.501607] nova-compute[62208]: DEBUG nova.compute.manager [req-e07c7bdf-616b-41ea-809e-dd0d069b084c req-dd94effa-d1b1-4f4c-bd5f-ef9b0a9b88ec service nova] [instance: e5ed059f-0390-480e-bafe-17092f272131] Refreshing instance network info cache due to event network-changed-b29c06d2-31e1-4987-a3d5-904f79182785. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2393.501771] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e07c7bdf-616b-41ea-809e-dd0d069b084c req-dd94effa-d1b1-4f4c-bd5f-ef9b0a9b88ec service nova] Acquiring lock "refresh_cache-e5ed059f-0390-480e-bafe-17092f272131" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2393.501992] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e07c7bdf-616b-41ea-809e-dd0d069b084c req-dd94effa-d1b1-4f4c-bd5f-ef9b0a9b88ec service nova] Acquired lock "refresh_cache-e5ed059f-0390-480e-bafe-17092f272131" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2393.502065] nova-compute[62208]: DEBUG nova.network.neutron [req-e07c7bdf-616b-41ea-809e-dd0d069b084c req-dd94effa-d1b1-4f4c-bd5f-ef9b0a9b88ec service nova] [instance: e5ed059f-0390-480e-bafe-17092f272131] Refreshing network info cache for port b29c06d2-31e1-4987-a3d5-904f79182785 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2393.735893] nova-compute[62208]: DEBUG nova.network.neutron [req-e07c7bdf-616b-41ea-809e-dd0d069b084c req-dd94effa-d1b1-4f4c-bd5f-ef9b0a9b88ec service nova] [instance: e5ed059f-0390-480e-bafe-17092f272131] Updated VIF entry in instance network info cache for port b29c06d2-31e1-4987-a3d5-904f79182785. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2393.736272] nova-compute[62208]: DEBUG nova.network.neutron [req-e07c7bdf-616b-41ea-809e-dd0d069b084c req-dd94effa-d1b1-4f4c-bd5f-ef9b0a9b88ec service nova] [instance: e5ed059f-0390-480e-bafe-17092f272131] Updating instance_info_cache with network_info: [{"id": "b29c06d2-31e1-4987-a3d5-904f79182785", "address": "fa:16:3e:3d:bb:91", "network": {"id": "cbda0e2b-397f-4dcf-8db1-578e267d61e2", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-153411038-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "b50fdece9ddf4b119e5f4f8ca3f4f16c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4df917f7-847a-4c0e-b0e3-69a52e4a1554", "external-id": "cl2-zone-457", "segmentation_id": 457, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb29c06d2-31", "ovs_interfaceid": "b29c06d2-31e1-4987-a3d5-904f79182785", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2393.745614] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e07c7bdf-616b-41ea-809e-dd0d069b084c req-dd94effa-d1b1-4f4c-bd5f-ef9b0a9b88ec service nova] Releasing lock "refresh_cache-e5ed059f-0390-480e-bafe-17092f272131" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2395.150739] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 
tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "c5f6249c-4435-4aad-99ce-f426c042a24a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2395.151054] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2395.162538] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2395.216937] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2395.217171] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2395.220601] nova-compute[62208]: INFO nova.compute.claims [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2395.372917] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9923d49-9133-4991-b548-a01f8cc2c39d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2395.375724] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2395.375724] nova-compute[62208]: warnings.warn( [ 2395.381084] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55d010a3-8b63-4ad3-a5bd-3157185b3d22 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2395.384108] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2395.384108] nova-compute[62208]: warnings.warn( [ 2395.411564] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d9d59ff-6ba9-496d-9c48-a3ad896fe16c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2395.414157] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2395.414157] nova-compute[62208]: warnings.warn( [ 2395.419609] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bface0ed-a68f-4ef7-94a7-3438fab15040 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2395.424697] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2395.424697] nova-compute[62208]: warnings.warn( [ 2395.434937] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2395.445387] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2395.462605] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2395.463093] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2395.498735] nova-compute[62208]: DEBUG nova.compute.utils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2395.500237] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2395.500413] nova-compute[62208]: DEBUG nova.network.neutron [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2395.519182] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2395.546186] nova-compute[62208]: DEBUG nova.policy [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8cb00a6413b46fcb17cbe532a0bffc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '53b578fa6aa34a2d80eb9938d58ffe12', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2395.589787] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2395.613921] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2395.614213] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2395.614420] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2395.614637] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2395.614815] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 2395.614991] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2395.615229] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2395.615462] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2395.615695] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2395.615900] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2395.616134] nova-compute[62208]: DEBUG nova.virt.hardware [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2395.617013] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07e247b6-708a-4996-80da-58719ed8a580 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2395.619667] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2395.619667] nova-compute[62208]: warnings.warn( [ 2395.625649] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eddcd821-c4a8-4693-a679-7b28ce060e89 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2395.629698] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2395.629698] nova-compute[62208]: warnings.warn( [ 2395.863578] nova-compute[62208]: DEBUG nova.network.neutron [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Successfully created port: 5d86209a-a315-4be5-a3b4-3a6fa932d70a {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2396.373893] nova-compute[62208]: DEBUG nova.compute.manager [req-814b96d8-f819-40cf-bd93-f87c6a430fcc req-2ab04d9a-bebe-4a34-9bf6-e6283fb07a8d service nova] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Received event network-vif-plugged-5d86209a-a315-4be5-a3b4-3a6fa932d70a {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2396.374246] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-814b96d8-f819-40cf-bd93-f87c6a430fcc req-2ab04d9a-bebe-4a34-9bf6-e6283fb07a8d service nova] Acquiring lock "c5f6249c-4435-4aad-99ce-f426c042a24a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2396.374561] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-814b96d8-f819-40cf-bd93-f87c6a430fcc req-2ab04d9a-bebe-4a34-9bf6-e6283fb07a8d service nova] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2396.374843] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-814b96d8-f819-40cf-bd93-f87c6a430fcc req-2ab04d9a-bebe-4a34-9bf6-e6283fb07a8d service nova] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2396.375079] nova-compute[62208]: DEBUG nova.compute.manager [req-814b96d8-f819-40cf-bd93-f87c6a430fcc req-2ab04d9a-bebe-4a34-9bf6-e6283fb07a8d service nova] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] No waiting events found dispatching network-vif-plugged-5d86209a-a315-4be5-a3b4-3a6fa932d70a {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2396.375352] nova-compute[62208]: WARNING nova.compute.manager [req-814b96d8-f819-40cf-bd93-f87c6a430fcc req-2ab04d9a-bebe-4a34-9bf6-e6283fb07a8d service nova] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Received unexpected event network-vif-plugged-5d86209a-a315-4be5-a3b4-3a6fa932d70a for instance with vm_state building and task_state spawning. 
[ 2396.447234] nova-compute[62208]: DEBUG nova.network.neutron [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Successfully updated port: 5d86209a-a315-4be5-a3b4-3a6fa932d70a {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2396.458065] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "refresh_cache-c5f6249c-4435-4aad-99ce-f426c042a24a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2396.458216] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "refresh_cache-c5f6249c-4435-4aad-99ce-f426c042a24a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2396.458431] nova-compute[62208]: DEBUG nova.network.neutron [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2396.498178] nova-compute[62208]: DEBUG nova.network.neutron [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2396.712797] nova-compute[62208]: DEBUG nova.network.neutron [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Updating instance_info_cache with network_info: [{"id": "5d86209a-a315-4be5-a3b4-3a6fa932d70a", "address": "fa:16:3e:41:2a:a5", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5d86209a-a3", "ovs_interfaceid": "5d86209a-a315-4be5-a3b4-3a6fa932d70a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2396.726289] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "refresh_cache-c5f6249c-4435-4aad-99ce-f426c042a24a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2396.726595] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Instance network_info: |[{"id": "5d86209a-a315-4be5-a3b4-3a6fa932d70a", "address": "fa:16:3e:41:2a:a5", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5d86209a-a3", "ovs_interfaceid": "5d86209a-a315-4be5-a3b4-3a6fa932d70a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 2396.727019] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:41:2a:a5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b5f9472-1844-4c99-8804-8f193cfff562', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5d86209a-a315-4be5-a3b4-3a6fa932d70a', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2396.734478] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2396.734993] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2396.735265] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1b82dd28-d1b3-4823-993b-6a037045e3db {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2396.749721] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2396.749721] nova-compute[62208]: warnings.warn( [ 2396.756076] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2396.756076] nova-compute[62208]: value = "task-38700" [ 2396.756076] nova-compute[62208]: _type = "Task" [ 2396.756076] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2396.759704] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2396.759704] nova-compute[62208]: warnings.warn( [ 2396.765440] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38700, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2397.140615] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2397.260376] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2397.260376] nova-compute[62208]: warnings.warn( [ 2397.266940] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38700, 'name': CreateVM_Task, 'duration_secs': 0.301277} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2397.267326] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2397.268085] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2397.268594] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2397.271547] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ccb970b-ba4d-481c-a8ce-c1ea15f29071 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.281801] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2397.281801] nova-compute[62208]: warnings.warn( [ 2397.303254] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Reconfiguring VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2397.303764] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-506bae81-d2a0-437f-bfd5-599674f79b1f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.313874] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2397.313874] nova-compute[62208]: warnings.warn( [ 2397.320304] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2397.320304] nova-compute[62208]: value = "task-38701" [ 2397.320304] nova-compute[62208]: _type = "Task" [ 2397.320304] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2397.323527] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2397.323527] nova-compute[62208]: warnings.warn( [ 2397.330580] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38701, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2397.824809] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2397.824809] nova-compute[62208]: warnings.warn( [ 2397.831153] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38701, 'name': ReconfigVM_Task, 'duration_secs': 0.114212} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2397.831597] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Reconfigured VM instance to enable vnc on port - 5907 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2397.831921] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.563s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2397.832307] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2397.832601] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2397.833005] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2397.833356] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b5bddd96-7ee2-4898-9726-76730fb2f8f8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.834984] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2397.834984] nova-compute[62208]: warnings.warn( [ 2397.838342] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2397.838342] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f594bc-1f27-f1ab-6f0f-4e1d7e5b78e2" [ 2397.838342] nova-compute[62208]: _type = "Task" [ 2397.838342] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2397.841401] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2397.841401] nova-compute[62208]: warnings.warn( [ 2397.846349] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f594bc-1f27-f1ab-6f0f-4e1d7e5b78e2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2398.141415] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2398.141882] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2398.142165] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2398.160354] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2398.160865] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2398.161134] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2398.162089] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2398.162362] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2398.162597] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2398.162819] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e5ed059f-0390-480e-bafe-17092f272131] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2398.163038] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2398.163253] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2398.343472] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2398.343472] nova-compute[62208]: warnings.warn( [ 2398.349698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2398.349958] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2398.350169] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2398.401392] nova-compute[62208]: DEBUG nova.compute.manager [req-ed29876e-9356-4a17-aad5-7b3f389b4199 req-d8b5e035-e442-406a-a533-cf8c1507d344 service nova] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Received event network-changed-5d86209a-a315-4be5-a3b4-3a6fa932d70a {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2398.401600] nova-compute[62208]: DEBUG nova.compute.manager [req-ed29876e-9356-4a17-aad5-7b3f389b4199 req-d8b5e035-e442-406a-a533-cf8c1507d344 service nova] [instance: 
c5f6249c-4435-4aad-99ce-f426c042a24a] Refreshing instance network info cache due to event network-changed-5d86209a-a315-4be5-a3b4-3a6fa932d70a. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2398.401816] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ed29876e-9356-4a17-aad5-7b3f389b4199 req-d8b5e035-e442-406a-a533-cf8c1507d344 service nova] Acquiring lock "refresh_cache-c5f6249c-4435-4aad-99ce-f426c042a24a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2398.401957] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ed29876e-9356-4a17-aad5-7b3f389b4199 req-d8b5e035-e442-406a-a533-cf8c1507d344 service nova] Acquired lock "refresh_cache-c5f6249c-4435-4aad-99ce-f426c042a24a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2398.402208] nova-compute[62208]: DEBUG nova.network.neutron [req-ed29876e-9356-4a17-aad5-7b3f389b4199 req-d8b5e035-e442-406a-a533-cf8c1507d344 service nova] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Refreshing network info cache for port 5d86209a-a315-4be5-a3b4-3a6fa932d70a {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2398.619967] nova-compute[62208]: DEBUG nova.network.neutron [req-ed29876e-9356-4a17-aad5-7b3f389b4199 req-d8b5e035-e442-406a-a533-cf8c1507d344 service nova] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Updated VIF entry in instance network info cache for port 5d86209a-a315-4be5-a3b4-3a6fa932d70a. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2398.620341] nova-compute[62208]: DEBUG nova.network.neutron [req-ed29876e-9356-4a17-aad5-7b3f389b4199 req-d8b5e035-e442-406a-a533-cf8c1507d344 service nova] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Updating instance_info_cache with network_info: [{"id": "5d86209a-a315-4be5-a3b4-3a6fa932d70a", "address": "fa:16:3e:41:2a:a5", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5d86209a-a3", "ovs_interfaceid": "5d86209a-a315-4be5-a3b4-3a6fa932d70a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2398.629964] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-ed29876e-9356-4a17-aad5-7b3f389b4199 req-d8b5e035-e442-406a-a533-cf8c1507d344 service nova] Releasing lock "refresh_cache-c5f6249c-4435-4aad-99ce-f426c042a24a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2400.140793] nova-compute[62208]: DEBUG 
oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2400.611651] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "91c64da9-f295-4e84-a8bd-149a72a239da" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2401.140950] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2402.141354] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2402.151951] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2402.152228] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2402.152458] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2402.152549] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2402.154060] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60ff41ca-b445-4028-a10d-8bad22add678 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.157030] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2402.157030] nova-compute[62208]: warnings.warn( [ 2402.163115] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e43e6d4a-f97e-409d-8b12-4110f2afb06d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.167149] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2402.167149] nova-compute[62208]: warnings.warn( [ 2402.179607] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb8936f3-1b55-44b5-a7ae-50c26c89b32a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.182047] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2402.182047] nova-compute[62208]: warnings.warn( [ 2402.187079] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61a80b87-8441-4d28-8ec9-543b29fc8c74 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.191978] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2402.191978] nova-compute[62208]: warnings.warn( [ 2402.218878] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181950MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2402.219044] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2402.219245] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2402.278574] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2402.278725] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec568c91-b110-4c2a-8d62-8127c7781d03 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2402.278852] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2402.278973] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 91c64da9-f295-4e84-a8bd-149a72a239da actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2402.279088] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2402.279201] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 655e577b-5034-4669-8fbd-8495671dd385 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2402.279338] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e5ed059f-0390-480e-bafe-17092f272131 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2402.279461] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c5f6249c-4435-4aad-99ce-f426c042a24a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2402.279650] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2402.279782] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2402.388858] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2f59812-11a8-4632-b64f-7d26d9ed17a1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.391336] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2402.391336] nova-compute[62208]: warnings.warn( [ 2402.396868] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e49d416e-5df6-4010-9dd3-6242172fa3e0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.400153] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2402.400153] nova-compute[62208]: warnings.warn( [ 2402.427679] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b2927d7-fae6-408b-b0f1-a85d60a17eb8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.430221] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2402.430221] nova-compute[62208]: warnings.warn( [ 2402.435865] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e438c58d-390b-4b57-88a5-e53f13483025 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.441006] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2402.441006] nova-compute[62208]: warnings.warn( [ 2402.451137] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2402.459505] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2402.476943] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2402.477218] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.477626] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2403.477943] nova-compute[62208]: DEBUG nova.compute.manager [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2404.141810] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2404.359025] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_power_states {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2404.376937] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Getting list of instances from cluster (obj){ [ 2404.376937] nova-compute[62208]: value = "domain-c8" [ 2404.376937] nova-compute[62208]: _type = "ClusterComputeResource" [ 2404.376937] nova-compute[62208]: } {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2404.378330] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e449809-ae9f-4d1e-af0a-a5f8d0614882 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2404.381903] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2404.381903] nova-compute[62208]: warnings.warn( [ 2404.395255] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Got total of 8 instances {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2404.395429] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 2404.395617] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid ec568c91-b110-4c2a-8d62-8127c7781d03 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 2404.395775] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 5b63cd2f-0b14-4008-b564-0078d3e0e20a {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 2404.395923] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 91c64da9-f295-4e84-a8bd-149a72a239da {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 2404.396098] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 75ca5bb3-c856-4548-924f-3ff3614b0f63 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 2404.396237] nova-compute[62208]: DEBUG nova.compute.manager [None 
req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 655e577b-5034-4669-8fbd-8495671dd385 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 2404.396384] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid e5ed059f-0390-480e-bafe-17092f272131 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 2404.396531] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid c5f6249c-4435-4aad-99ce-f426c042a24a {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 2404.396847] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2404.397081] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "ec568c91-b110-4c2a-8d62-8127c7781d03" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2404.397323] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2404.397543] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "91c64da9-f295-4e84-a8bd-149a72a239da" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2404.397758] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2404.397960] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "655e577b-5034-4669-8fbd-8495671dd385" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2404.398232] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "e5ed059f-0390-480e-bafe-17092f272131" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2404.398404] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "c5f6249c-4435-4aad-99ce-f426c042a24a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2405.175082] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2436.442754] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2436.442754] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2436.443810] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2436.445272] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2436.445552] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/db05a386-bbe1-4780-9286-9e8fcabbe668/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2436.445853] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7290594f-1b02-4f2a-a23c-dfad4e717e58 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2436.448544] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2436.448544] nova-compute[62208]: warnings.warn( [ 2436.455165] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2436.455165] nova-compute[62208]: value = "task-38702" [ 2436.455165] nova-compute[62208]: _type = "Task" [ 2436.455165] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2436.458815] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2436.458815] nova-compute[62208]: warnings.warn( [ 2436.464312] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38702, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2436.959430] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2436.959430] nova-compute[62208]: warnings.warn( [ 2436.965515] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2436.965802] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2436.966411] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Traceback (most recent call last): [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] yield resources [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] self.driver.spawn(context, instance, image_meta, [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] self._fetch_image_if_missing(context, vi) [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] image_cache(vi, tmp_image_ds_loc) [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] vm_util.copy_virtual_disk( [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] session._wait_for_task(vmdk_copy_task) [ 2436.966411] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] return self.wait_for_task(task_ref) [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] return evt.wait() [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] result = hub.switch() [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] return self.greenlet.switch() [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] self.f(*self.args, **self.kw) [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] raise exceptions.translate_fault(task_info.error) [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Faults: ['InvalidArgument'] [ 2436.966411] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] [ 2436.969171] nova-compute[62208]: INFO nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Terminating instance [ 2436.969171] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2436.969171] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating directory with path [datastore2] 
devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2436.969171] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-59366ae3-14a4-4543-bc60-dea3acb0d36c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2436.971333] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2436.971517] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2436.972288] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99aae1a5-e980-45c9-a050-2949d083114a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2436.974576] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2436.974576] nova-compute[62208]: warnings.warn( [ 2436.974920] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2436.974920] nova-compute[62208]: warnings.warn( [ 2436.979545] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2436.979777] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b391f8bb-8e15-4f71-8cbc-8e0448a653e3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2436.982212] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2436.982381] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2436.982970] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2436.982970] nova-compute[62208]: warnings.warn( [ 2436.983400] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d4e99905-8f3e-4e65-8bdd-0a13c43a0496 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2436.985631] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2436.985631] nova-compute[62208]: warnings.warn( [ 2436.988525] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 2436.988525] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]523bc2c0-0ee5-7b58-4f4c-975d475255c1" [ 2436.988525] nova-compute[62208]: _type = "Task" [ 2436.988525] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2436.991202] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2436.991202] nova-compute[62208]: warnings.warn( [ 2436.995966] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]523bc2c0-0ee5-7b58-4f4c-975d475255c1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2437.050498] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2437.050791] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2437.050911] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleting the datastore file [datastore2] 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2437.051222] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a1936006-6872-4143-92d1-2629ed4512c9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.053156] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.053156] nova-compute[62208]: warnings.warn( [ 2437.057951] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2437.057951] nova-compute[62208]: value = "task-38704" [ 2437.057951] nova-compute[62208]: _type = "Task" [ 2437.057951] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2437.062858] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.062858] nova-compute[62208]: warnings.warn( [ 2437.067748] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38704, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2437.492692] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.492692] nova-compute[62208]: warnings.warn( [ 2437.498995] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2437.499236] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating directory with path [datastore2] vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2437.499504] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d881ca9b-0c39-4d5a-8ea1-df506fe83f23 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.501188] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.501188] nova-compute[62208]: warnings.warn( [ 2437.511139] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Created directory with path [datastore2] vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2437.511332] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Fetch image to [datastore2] vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2437.511502] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2437.512271] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93eb0309-670c-4558-a3bd-23eb51117182 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.514530] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.514530] nova-compute[62208]: warnings.warn( [ 2437.519082] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cdba836-9967-487d-a23a-24b0962495b5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.521252] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.521252] nova-compute[62208]: warnings.warn( [ 2437.528254] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20de9af3-8e48-4c7b-824c-ded64f174aea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.531704] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.531704] nova-compute[62208]: warnings.warn( [ 2437.558151] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2ccf1f5-afa2-49fc-a0d6-f18e0095380f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.563258] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.563258] nova-compute[62208]: warnings.warn( [ 2437.563650] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.563650] nova-compute[62208]: warnings.warn( [ 2437.569829] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ea2e29eb-6fc8-4c2e-9dad-700e24d1e1f6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.571624] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38704, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079504} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2437.571868] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2437.572064] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2437.572236] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2437.572411] nova-compute[62208]: INFO nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Took 0.60 seconds to destroy the instance on the hypervisor. [ 2437.573814] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.573814] nova-compute[62208]: warnings.warn( [ 2437.574525] nova-compute[62208]: DEBUG nova.compute.claims [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936a67880> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2437.574694] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2437.574926] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2437.595200] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2437.718603] nova-compute[62208]: DEBUG 
oslo_vmware.rw_handles [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2437.774912] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ced2a991-a66f-4488-abd4-236ac2a3fc80 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.778624] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.778624] nova-compute[62208]: warnings.warn( [ 2437.779671] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2437.779856] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2437.784283] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f94914c-fc22-4f4c-892a-323e6639c277 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.788484] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.788484] nova-compute[62208]: warnings.warn( [ 2437.817441] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b77b2331-937d-4380-86dc-632fcd936303 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.819989] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.819989] nova-compute[62208]: warnings.warn( [ 2437.825507] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eebe543-332f-4fc1-a5d8-d0abdcb62c7b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2437.829311] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2437.829311] nova-compute[62208]: warnings.warn( [ 2437.839546] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2437.848827] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2437.865210] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.290s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2437.865807] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Traceback (most recent call last): [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] self.driver.spawn(context, instance, image_meta, [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2437.865807] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] self._fetch_image_if_missing(context, vi) [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] image_cache(vi, tmp_image_ds_loc) [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] vm_util.copy_virtual_disk( [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] session._wait_for_task(vmdk_copy_task) [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] return self.wait_for_task(task_ref) [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] return evt.wait() [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] result = hub.switch() [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] return self.greenlet.switch() [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] self.f(*self.args, **self.kw) [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 
597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] raise exceptions.translate_fault(task_info.error) [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Faults: ['InvalidArgument'] [ 2437.865807] nova-compute[62208]: ERROR nova.compute.manager [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] [ 2437.866705] nova-compute[62208]: DEBUG nova.compute.utils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2437.868063] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Build of instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 was re-scheduled: A specified parameter was not correct: fileType [ 2437.868063] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2437.868496] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2437.868675] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2437.868847] nova-compute[62208]: DEBUG nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2437.869011] nova-compute[62208]: DEBUG nova.network.neutron [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2438.147450] nova-compute[62208]: DEBUG nova.network.neutron [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2438.162790] nova-compute[62208]: INFO nova.compute.manager [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Took 0.29 seconds to deallocate network for instance. [ 2438.266633] nova-compute[62208]: INFO nova.scheduler.client.report [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted allocations for instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 [ 2438.286192] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2e21fb24-0ac9-492b-b56c-7fc748f0c1da tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 483.774s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2438.286512] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 288.466s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2438.286732] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2438.286935] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s 
{{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2438.287095] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2438.289164] nova-compute[62208]: INFO nova.compute.manager [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Terminating instance [ 2438.290857] nova-compute[62208]: DEBUG nova.compute.manager [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2438.291041] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2438.291504] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ced30676-fa75-4e57-80da-782c6fa37d52 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.294057] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2438.294057] nova-compute[62208]: warnings.warn( [ 2438.300933] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63cfd0f8-c4fc-4961-9694-89506e1e4560 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.311604] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2438.311604] nova-compute[62208]: warnings.warn( [ 2438.329703] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2 could not be found. 
[ 2438.329899] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2438.330075] nova-compute[62208]: INFO nova.compute.manager [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2438.330320] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2438.330868] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2438.330868] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2438.356027] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2438.365200] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] Took 0.03 seconds to deallocate network for instance. [ 2438.459471] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-ce8faadd-07d0-4741-b532-c61f5a060a57 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.173s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2438.460360] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 34.063s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2438.460554] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 597a5acd-7d1f-4c5c-bc6e-40fee5e977e2] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2438.460734] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "597a5acd-7d1f-4c5c-bc6e-40fee5e977e2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2442.685476] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2455.140910] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2458.142534] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2458.142817] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2458.142864] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2458.160155] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2458.160323] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2458.160442] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2458.160550] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2458.160667] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2458.160782] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e5ed059f-0390-480e-bafe-17092f272131] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2458.160899] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2458.161013] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2459.140732] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2461.140868] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2461.141183] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2463.141337] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2463.151790] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2463.152049] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2463.152239] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2463.152399] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) 
update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2463.153619] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b114e81-3c80-42db-a9e3-a195e94ce7a1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2463.156936] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2463.156936] nova-compute[62208]: warnings.warn( [ 2463.163261] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64f24cb4-c9ba-4424-9bcb-bb926d86d02e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2463.167229] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2463.167229] nova-compute[62208]: warnings.warn( [ 2463.179793] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91149aff-d347-4088-be98-1072c65af2fe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2463.182218] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2463.182218] nova-compute[62208]: warnings.warn( [ 2463.187374] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb2bbc72-d4c1-43cb-a826-242b33e75bd7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2463.190383] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2463.190383] nova-compute[62208]: warnings.warn( [ 2463.216952] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181948MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2463.217066] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2463.217272] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2463.278725] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance ec568c91-b110-4c2a-8d62-8127c7781d03 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2463.278809] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2463.278950] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 91c64da9-f295-4e84-a8bd-149a72a239da actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2463.279082] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2463.279198] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 655e577b-5034-4669-8fbd-8495671dd385 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2463.279335] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e5ed059f-0390-480e-bafe-17092f272131 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2463.279473] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c5f6249c-4435-4aad-99ce-f426c042a24a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2463.279695] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2463.279775] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2463.391673] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b101db61-478c-410d-95eb-a34c0fa8fe5b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2463.394378] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2463.394378] nova-compute[62208]: warnings.warn( [ 2463.400612] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-092cfc8a-754e-4722-8ba3-bcdfbe7595e6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2463.404048] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2463.404048] nova-compute[62208]: warnings.warn( [ 2463.433820] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47e93ab9-74a5-4fc6-a505-ecc4eb4718c1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2463.436494] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2463.436494] nova-compute[62208]: warnings.warn( [ 2463.443536] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dec1848-bcd0-44df-a973-7aeef2bcff89 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2463.447867] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2463.447867] nova-compute[62208]: warnings.warn( [ 2463.458964] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2463.467963] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2463.487291] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2463.487415] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2464.487693] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2465.136462] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2465.141242] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2465.141415] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, 
skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2477.136652] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2487.253776] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2487.253776] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2487.253776] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2487.255886] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2487.256165] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Copying Virtual Disk [datastore2] vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/ca800e89-918a-438c-9756-24e91af1196f/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2487.256467] 
nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-23af96bc-4765-4f09-90a4-aa4315001bab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2487.259172] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.259172] nova-compute[62208]: warnings.warn( [ 2487.266462] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 2487.266462] nova-compute[62208]: value = "task-38705" [ 2487.266462] nova-compute[62208]: _type = "Task" [ 2487.266462] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2487.269865] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.269865] nova-compute[62208]: warnings.warn( [ 2487.275853] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38705, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2487.770936] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.770936] nova-compute[62208]: warnings.warn( [ 2487.777194] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2487.777478] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2487.778057] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Traceback (most recent call last): [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] yield resources [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] self.driver.spawn(context, instance, image_meta, [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] self._fetch_image_if_missing(context, vi) [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] image_cache(vi, tmp_image_ds_loc) [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] vm_util.copy_virtual_disk( [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] 
session._wait_for_task(vmdk_copy_task) [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] return self.wait_for_task(task_ref) [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] return evt.wait() [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] result = hub.switch() [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] return self.greenlet.switch() [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] self.f(*self.args, **self.kw) [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] raise exceptions.translate_fault(task_info.error) [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Faults: ['InvalidArgument'] [ 2487.778057] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] [ 2487.779003] nova-compute[62208]: INFO nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Terminating instance [ 2487.780971] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2487.781162] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 
tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2487.781804] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2487.781998] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2487.782248] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bafda975-e3ec-4023-ab75-a8b1bee7f885 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2487.784948] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-244c516b-8fd4-45d1-ad36-f575781989fd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2487.787301] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.787301] nova-compute[62208]: warnings.warn( [ 2487.787661] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.787661] nova-compute[62208]: warnings.warn( [ 2487.792524] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2487.792793] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-abd5d2af-23c7-4b6d-af13-bec1d0e483a4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2487.795350] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2487.795584] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2487.796248] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.796248] nova-compute[62208]: warnings.warn( [ 2487.796741] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d6b3ae3e-dc0a-469e-a43a-d5bf9c5b8be5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2487.799006] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.799006] nova-compute[62208]: warnings.warn( [ 2487.802091] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2487.802091] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527862d2-f9f0-24eb-c401-b2d8e440014d" [ 2487.802091] nova-compute[62208]: _type = "Task" [ 2487.802091] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2487.805058] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.805058] nova-compute[62208]: warnings.warn( [ 2487.811189] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527862d2-f9f0-24eb-c401-b2d8e440014d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2487.873411] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2487.873824] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2487.874145] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleting the datastore file [datastore2] ec568c91-b110-4c2a-8d62-8127c7781d03 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2487.875020] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1c0714ed-3492-4a49-a629-bf4fb41af817 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2487.877547] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.877547] nova-compute[62208]: warnings.warn( [ 2487.883304] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for the task: (returnval){ [ 2487.883304] nova-compute[62208]: value = "task-38707" [ 2487.883304] nova-compute[62208]: _type = "Task" [ 2487.883304] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2487.886794] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2487.886794] nova-compute[62208]: warnings.warn( [ 2487.892214] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38707, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2488.306193] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.306193] nova-compute[62208]: warnings.warn( [ 2488.312413] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2488.312666] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2488.312953] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1987528c-de2c-406b-9397-1bf7c91da1fc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.314778] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.314778] nova-compute[62208]: warnings.warn( [ 2488.324914] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2488.325109] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Fetch image to [datastore2] vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2488.325263] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2488.326038] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebf49bc5-1031-4eb7-b06f-0e209edff8f4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.328474] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.328474] nova-compute[62208]: warnings.warn( [ 2488.334710] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bfd4ec1-7b6b-452c-9cb9-038bd01ce641 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.337054] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.337054] nova-compute[62208]: warnings.warn( [ 2488.344516] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-805dea13-3ac0-4b15-bbc6-869a0deee47e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.348101] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.348101] nova-compute[62208]: warnings.warn( [ 2488.375334] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea9ed80d-6432-40f4-b7c1-a35df48ee59b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.377795] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.377795] nova-compute[62208]: warnings.warn( [ 2488.382041] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c7b7935f-77f1-4d70-9b01-f0c6309de67d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.386654] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.386654] nova-compute[62208]: warnings.warn( [ 2488.387000] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.387000] nova-compute[62208]: warnings.warn( [ 2488.392226] nova-compute[62208]: DEBUG oslo_vmware.api [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Task: {'id': task-38707, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084499} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2488.392495] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2488.392681] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2488.392855] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2488.393030] nova-compute[62208]: INFO nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Took 0.61 seconds to destroy the instance on the hypervisor. [ 2488.395259] nova-compute[62208]: DEBUG nova.compute.claims [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9376357b0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2488.395439] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2488.395652] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2488.412121] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2488.463537] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = 
https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2488.521702] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2488.521895] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2488.588445] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de2d954b-b253-42a8-a417-d0210b5fc0b5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.591023] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.591023] nova-compute[62208]: warnings.warn( [ 2488.596341] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca07421a-e788-416d-b49a-148247e59c4f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.599672] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.599672] nova-compute[62208]: warnings.warn( [ 2488.627921] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05d6bbd0-6939-4bbd-93df-9a922145a497 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.630400] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.630400] nova-compute[62208]: warnings.warn( [ 2488.635809] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb6882f4-a54e-47fa-9eab-369c877e3da3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2488.639526] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2488.639526] nova-compute[62208]: warnings.warn( [ 2488.649200] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2488.657827] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2488.674345] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.279s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2488.674909] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Traceback (most recent call last): [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] self.driver.spawn(context, instance, image_meta, [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, 
in spawn [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] self._fetch_image_if_missing(context, vi) [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] image_cache(vi, tmp_image_ds_loc) [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] vm_util.copy_virtual_disk( [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] session._wait_for_task(vmdk_copy_task) [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] return self.wait_for_task(task_ref) [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] return evt.wait() [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] result = hub.switch() [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] return self.greenlet.switch() [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] self.f(*self.args, **self.kw) [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2488.674909] nova-compute[62208]: 
ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] raise exceptions.translate_fault(task_info.error) [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Faults: ['InvalidArgument'] [ 2488.674909] nova-compute[62208]: ERROR nova.compute.manager [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] [ 2488.675717] nova-compute[62208]: DEBUG nova.compute.utils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2488.677134] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Build of instance ec568c91-b110-4c2a-8d62-8127c7781d03 was re-scheduled: A specified parameter was not correct: fileType [ 2488.677134] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2488.677620] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2488.677934] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2488.678133] nova-compute[62208]: DEBUG nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2488.678346] nova-compute[62208]: DEBUG nova.network.neutron [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2488.944058] nova-compute[62208]: DEBUG nova.network.neutron [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2488.957848] nova-compute[62208]: INFO nova.compute.manager [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Took 0.28 seconds to deallocate network for instance. [ 2489.078944] nova-compute[62208]: INFO nova.scheduler.client.report [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Deleted allocations for instance ec568c91-b110-4c2a-8d62-8127c7781d03 [ 2489.100393] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-382ba2a1-3993-42ed-b8c3-0eb94e3334e4 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 498.449s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2489.100711] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 301.892s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2489.100954] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Acquiring lock "ec568c91-b110-4c2a-8d62-8127c7781d03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2489.101170] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock 
"ec568c91-b110-4c2a-8d62-8127c7781d03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2489.101355] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2489.103859] nova-compute[62208]: INFO nova.compute.manager [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Terminating instance [ 2489.105651] nova-compute[62208]: DEBUG nova.compute.manager [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2489.105868] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2489.106607] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5c122e36-9213-464e-b0cb-800f0fcac92c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2489.109462] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2489.109462] nova-compute[62208]: warnings.warn( [ 2489.117843] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40484467-695d-45a6-a970-a10b4292a0c9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2489.128674] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2489.128674] nova-compute[62208]: warnings.warn( [ 2489.145437] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ec568c91-b110-4c2a-8d62-8127c7781d03 could not be found. 
[ 2489.145647] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2489.145826] nova-compute[62208]: INFO nova.compute.manager [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2489.146072] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2489.146288] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2489.146378] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2489.173613] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2489.182889] nova-compute[62208]: INFO nova.compute.manager [-] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] Took 0.04 seconds to deallocate network for instance. [ 2489.278220] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0f08719d-7f89-416b-b573-42d34aa89625 tempest-AttachInterfacesTestJSON-656904357 tempest-AttachInterfacesTestJSON-656904357-project-member] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.177s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2489.279159] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 84.882s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2489.279472] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: ec568c91-b110-4c2a-8d62-8127c7781d03] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2489.279891] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "ec568c91-b110-4c2a-8d62-8127c7781d03" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2515.140531] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2519.141833] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2519.142170] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2519.142170] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2519.158401] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2519.158597] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2519.158722] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2519.158852] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2519.158974] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e5ed059f-0390-480e-bafe-17092f272131] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2519.159094] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2519.159212] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2520.141546] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2522.141044] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2522.141406] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2525.136681] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2525.140264] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2525.140421] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2525.140586] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2525.151069] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2525.151276] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2525.151437] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2525.151589] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2525.152651] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e857792e-682e-42b5-aab6-62bfd0d35b14 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.155405] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2525.155405] nova-compute[62208]: warnings.warn( [ 2525.161626] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0bab966-d8f3-4b27-8386-99c399820992 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.165170] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2525.165170] nova-compute[62208]: warnings.warn( [ 2525.175699] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55375caf-d92a-4e65-9c71-72b5a3d6bca9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.177929] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2525.177929] nova-compute[62208]: warnings.warn( [ 2525.182479] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c85c144b-74cd-4486-a0a1-f2c84e534e93 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.186457] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2525.186457] nova-compute[62208]: warnings.warn( [ 2525.214187] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181951MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2525.214354] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2525.214541] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2525.266365] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2525.266522] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 91c64da9-f295-4e84-a8bd-149a72a239da actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2525.266650] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2525.266772] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 655e577b-5034-4669-8fbd-8495671dd385 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2525.266891] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e5ed059f-0390-480e-bafe-17092f272131 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2525.267008] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c5f6249c-4435-4aad-99ce-f426c042a24a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2525.267182] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2525.267319] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2525.346360] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aac52be6-c2b1-4ede-9e77-1ae9c7934548 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.349420] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
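The "Final resource view" figures above are consistent with the six per-instance allocations listed just before them, assuming the usual resource-tracker accounting of host-reserved memory plus per-instance usage:

    # Worked check of used_ram=1280MB, used_disk=6GB, used_vcpus=6.
    instances = 6
    per_instance = {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}
    reserved_host_memory_mb = 512   # matches the MEMORY_MB 'reserved' reported to placement

    used_ram_mb = reserved_host_memory_mb + instances * per_instance['MEMORY_MB']
    used_disk_gb = instances * per_instance['DISK_GB']
    used_vcpus = instances * per_instance['VCPU']

    print(used_ram_mb, used_disk_gb, used_vcpus)   # 1280 6 6, as reported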
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2525.349420] nova-compute[62208]: warnings.warn( [ 2525.354982] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dc93cee-17fe-4ed7-82fb-15588c2280ab {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.357966] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2525.357966] nova-compute[62208]: warnings.warn( [ 2525.385534] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05da3628-107c-46e7-9e49-f18d3877b8a1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.387910] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2525.387910] nova-compute[62208]: warnings.warn( [ 2525.393171] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c64ecb09-6431-4251-ace8-91fe3b79a52b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2525.396825] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2525.396825] nova-compute[62208]: warnings.warn( [ 2525.406225] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2525.414304] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2525.430722] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2525.430904] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2526.432318] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2535.148795] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2535.148795] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2535.148795] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2535.149737] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2535.151207] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2535.151437] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Copying Virtual Disk [datastore2] vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/38685930-1d48-444c-a78e-c792988fa0dc/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2535.151754] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1b39375f-c857-4536-b413-778523de4379 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2535.154289] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.154289] nova-compute[62208]: warnings.warn( [ 2535.161062] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2535.161062] nova-compute[62208]: value = "task-38708" [ 2535.161062] nova-compute[62208]: _type = "Task" [ 2535.161062] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2535.164101] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.164101] nova-compute[62208]: warnings.warn( [ 2535.170851] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38708, 'name': CopyVirtualDisk_Task} progress is 0%. 
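The rw_handles WARNING and traceback above reduce to http.client raising RemoteDisconnected when the datastore HTTP endpoint closes the connection before sending a status line. A self-contained sketch of that failure mode, with a placeholder host and path:

    import http.client

    conn = http.client.HTTPSConnection('datastore.example', timeout=10)
    try:
        conn.request('PUT', '/folder/tmp-sparse.vmdk')
        resp = conn.getresponse()   # RemoteDisconnected if the peer hangs up first
    except (http.client.RemoteDisconnected, OSError):
        # rw_handles logs this as a WARNING on close(); in the log above the
        # image data had already been written, so the spawn attempt continues.
        pass
    finally:
        conn.close()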
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2535.665317] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.665317] nova-compute[62208]: warnings.warn( [ 2535.671426] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2535.671749] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2535.672313] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Traceback (most recent call last): [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] yield resources [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] self.driver.spawn(context, instance, image_meta, [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] self._fetch_image_if_missing(context, vi) [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 
5b63cd2f-0b14-4008-b564-0078d3e0e20a] image_cache(vi, tmp_image_ds_loc) [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] vm_util.copy_virtual_disk( [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] session._wait_for_task(vmdk_copy_task) [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] return self.wait_for_task(task_ref) [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] return evt.wait() [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] result = hub.switch() [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] return self.greenlet.switch() [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] self.f(*self.args, **self.kw) [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] raise exceptions.translate_fault(task_info.error) [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Faults: ['InvalidArgument'] [ 2535.672313] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] [ 2535.674287] nova-compute[62208]: INFO nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 
5b63cd2f-0b14-4008-b564-0078d3e0e20a] Terminating instance [ 2535.674287] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2535.674429] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2535.674606] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9a3fbcd9-98f7-44fe-87be-f88851ce4077 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2535.677158] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2535.677158] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2535.677691] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fdd585a-b048-4271-9e26-c1682093757d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2535.680123] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.680123] nova-compute[62208]: warnings.warn( [ 2535.680459] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.680459] nova-compute[62208]: warnings.warn( [ 2535.684877] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2535.685131] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-42a747b7-a934-4b7f-bd35-93659e32ba49 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2535.687379] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2535.687552] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2535.688133] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.688133] nova-compute[62208]: warnings.warn( [ 2535.688560] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9afa49b2-72c0-4f1f-bd4f-cb37992fc6dd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2535.690472] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.690472] nova-compute[62208]: warnings.warn( [ 2535.693347] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Waiting for the task: (returnval){ [ 2535.693347] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522bb81d-7166-92c2-0257-3426a2e47edb" [ 2535.693347] nova-compute[62208]: _type = "Task" [ 2535.693347] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2535.696305] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.696305] nova-compute[62208]: warnings.warn( [ 2535.700872] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]522bb81d-7166-92c2-0257-3426a2e47edb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2535.762170] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2535.762536] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2535.762788] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleting the datastore file [datastore2] 5b63cd2f-0b14-4008-b564-0078d3e0e20a {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2535.763141] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8098ec2f-9f0b-4e08-9185-62524fd50e93 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2535.765016] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.765016] nova-compute[62208]: warnings.warn( [ 2535.770265] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2535.770265] nova-compute[62208]: value = "task-38710" [ 2535.770265] nova-compute[62208]: _type = "Task" [ 2535.770265] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2535.773428] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2535.773428] nova-compute[62208]: warnings.warn( [ 2535.778527] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38710, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2536.197435] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.197435] nova-compute[62208]: warnings.warn( [ 2536.203115] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2536.203333] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Creating directory with path [datastore2] vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2536.203571] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ac59335-e7b8-454b-8544-09a7c54f9b24 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.205337] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.205337] nova-compute[62208]: warnings.warn( [ 2536.214656] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Created directory with path [datastore2] vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2536.214886] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Fetch image to [datastore2] vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2536.215023] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2536.215735] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ce87fe4-0803-4f25-beae-67ebb44e3577 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.217965] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.217965] nova-compute[62208]: warnings.warn( [ 2536.222275] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22fbf932-5aa7-4f22-bfec-0c3c234e3e71 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.224471] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.224471] nova-compute[62208]: warnings.warn( [ 2536.231438] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bffea5a9-ed9f-452e-8fe0-bacbbf5a6a59 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.235082] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.235082] nova-compute[62208]: warnings.warn( [ 2536.261542] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b396db24-b8f0-442e-971f-0fac9613b3d3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.263816] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.263816] nova-compute[62208]: warnings.warn( [ 2536.267610] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f22f3683-35b6-4705-9a6b-9d5432ecfe61 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.269224] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.269224] nova-compute[62208]: warnings.warn( [ 2536.274509] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.274509] nova-compute[62208]: warnings.warn( [ 2536.279536] nova-compute[62208]: DEBUG oslo_vmware.api [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38710, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066979} completed successfully. 
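The recurring InsecureRequestWarning lines indicate the HTTPS requests to the vCenter endpoint are made without certificate verification. A generic urllib3 illustration (not the nova/oslo.vmware code path) of the unverified pattern that triggers the warning versus a verified pool; the CA bundle path is a placeholder:

    import urllib3

    # Unverified: this is the pattern that triggers InsecureRequestWarning.
    insecure = urllib3.PoolManager(cert_reqs='CERT_NONE')
    urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)  # silences, does not fix

    # Verified: supply a CA bundle so the server certificate is actually checked.
    verified = urllib3.PoolManager(cert_reqs='CERT_REQUIRED',
                                   ca_certs='/etc/ssl/certs/ca-bundle.pem')

In a real deployment the fix belongs in the driver's TLS configuration rather than in code; the snippet only shows what the warning is distinguishing.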
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2536.279784] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2536.279965] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2536.280152] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2536.280341] nova-compute[62208]: INFO nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Took 0.60 seconds to destroy the instance on the hypervisor. [ 2536.282475] nova-compute[62208]: DEBUG nova.compute.claims [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Aborting claim: <nova.compute.claims.Claim object at 0x7fb93636ca90> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2536.282647] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2536.282938] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2536.290998] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2536.341350] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = 
https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2536.401609] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2536.401834] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2536.455046] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d2305eb-f896-48b7-bb8b-1a3cda7aac65 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.457508] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.457508] nova-compute[62208]: warnings.warn( [ 2536.462481] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e97d4ac2-9c4e-4b1c-9945-bbd37d6576cf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.465471] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.465471] nova-compute[62208]: warnings.warn( [ 2536.492806] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7820b843-4d85-41e2-8433-0b03703aac4e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.495619] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.495619] nova-compute[62208]: warnings.warn( [ 2536.500960] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f638048c-2495-439b-be8a-7419f7f1d556 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2536.504805] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2536.504805] nova-compute[62208]: warnings.warn( [ 2536.514524] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2536.523649] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2536.541758] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.259s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2536.542331] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Traceback (most recent call last): [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] self.driver.spawn(context, instance, image_meta, [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 
2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] self._fetch_image_if_missing(context, vi) [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] image_cache(vi, tmp_image_ds_loc) [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] vm_util.copy_virtual_disk( [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] session._wait_for_task(vmdk_copy_task) [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] return self.wait_for_task(task_ref) [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] return evt.wait() [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] result = hub.switch() [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] return self.greenlet.switch() [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] self.f(*self.args, **self.kw) [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2536.542331] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] raise exceptions.translate_fault(task_info.error) [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Faults: ['InvalidArgument'] [ 2536.542331] nova-compute[62208]: ERROR nova.compute.manager [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] [ 2536.544021] nova-compute[62208]: DEBUG nova.compute.utils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2536.544471] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Build of instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a was re-scheduled: A specified parameter was not correct: fileType [ 2536.544471] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2536.544842] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2536.544999] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2536.545167] nova-compute[62208]: DEBUG nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2536.545359] nova-compute[62208]: DEBUG nova.network.neutron [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2537.046398] nova-compute[62208]: DEBUG nova.network.neutron [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2537.065245] nova-compute[62208]: INFO nova.compute.manager [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Took 0.52 seconds to deallocate network for instance. [ 2537.171419] nova-compute[62208]: INFO nova.scheduler.client.report [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted allocations for instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a [ 2537.190726] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7cd657a8-63d9-4346-ba6f-35d4b1e46950 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 535.179s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2537.191029] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 338.737s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2537.191253] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2537.191499] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a-events" acquired 
by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2537.191609] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2537.195142] nova-compute[62208]: INFO nova.compute.manager [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Terminating instance [ 2537.197255] nova-compute[62208]: DEBUG nova.compute.manager [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2537.197445] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2537.197738] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4d7ff463-2434-4a5b-a193-3a72633183ff {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2537.199792] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2537.199792] nova-compute[62208]: warnings.warn( [ 2537.207129] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99ebfd52-b957-4a37-9a78-ca469fc5bc71 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2537.217428] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2537.217428] nova-compute[62208]: warnings.warn( [ 2537.232982] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5b63cd2f-0b14-4008-b564-0078d3e0e20a could not be found. 
[ 2537.233152] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2537.233335] nova-compute[62208]: INFO nova.compute.manager [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2537.233586] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2537.234282] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2537.234383] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2537.264796] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2537.275572] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] Took 0.04 seconds to deallocate network for instance. [ 2537.367333] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-5f7f6285-4817-4ae3-b867-31c36ba043f5 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.176s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2537.368888] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 132.971s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2537.369087] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 5b63cd2f-0b14-4008-b564-0078d3e0e20a] During sync_power_state the instance has a pending task (deleting). Skip. 
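The urllib3 InsecureRequestWarning lines repeated throughout this log are emitted because the vmwareapi driver talks to vCenter over HTTPS without certificate verification. A hedged example of the relevant nova.conf settings (the [vmware] group options exist in Nova; the values and the CA file path below are placeholders, not taken from this environment):

    [vmware]
    host_ip = vc1.osci.c.eu-de-1.cloud.sap
    insecure = False
    ca_file = /etc/nova/vcenter-ca.pem   # hypothetical path to the vCenter CA bundle
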
[ 2537.369271] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "5b63cd2f-0b14-4008-b564-0078d3e0e20a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2543.494383] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "655e577b-5034-4669-8fbd-8495671dd385" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2576.142525] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2579.141666] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2579.142030] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2579.142030] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2579.159489] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2579.159691] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2579.159830] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2579.159960] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e5ed059f-0390-480e-bafe-17092f272131] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2579.160249] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2579.160409] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2580.141773] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2581.488155] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2581.488155] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2581.488877] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2581.490505] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2581.490747] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/9827f181-6c03-488a-b885-023edefd724a/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2581.491037] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-026f213f-888d-48a8-b38e-8fe6e1b43b4e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2581.493401] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2581.493401] nova-compute[62208]: warnings.warn( [ 2581.499397] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Waiting for the task: (returnval){ [ 2581.499397] nova-compute[62208]: value = "task-38711" [ 2581.499397] nova-compute[62208]: _type = "Task" [ 2581.499397] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2581.503269] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2581.503269] nova-compute[62208]: warnings.warn( [ 2581.509061] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Task: {'id': task-38711, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2582.003628] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.003628] nova-compute[62208]: warnings.warn( [ 2582.010009] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2582.010577] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2582.010841] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Traceback (most recent call last): [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] yield resources [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] self.driver.spawn(context, instance, image_meta, [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] self._fetch_image_if_missing(context, vi) [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] image_cache(vi, tmp_image_ds_loc) [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] vm_util.copy_virtual_disk( [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 
91c64da9-f295-4e84-a8bd-149a72a239da] session._wait_for_task(vmdk_copy_task) [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] return self.wait_for_task(task_ref) [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] return evt.wait() [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] result = hub.switch() [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] return self.greenlet.switch() [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] self.f(*self.args, **self.kw) [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] raise exceptions.translate_fault(task_info.error) [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Faults: ['InvalidArgument'] [ 2582.010841] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] [ 2582.012062] nova-compute[62208]: INFO nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Terminating instance [ 2582.014077] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2582.014274] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2582.014573] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2582.014769] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2582.015524] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-268bbef9-ede8-4287-9ed0-aad1130e5551 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.018193] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf180dea-9e44-44e9-a919-91e1e1a00ee5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.020231] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.020231] nova-compute[62208]: warnings.warn( [ 2582.020566] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.020566] nova-compute[62208]: warnings.warn( [ 2582.024557] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2582.024810] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-243dd711-fffa-4496-9ade-20f1e66d6cfd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.027178] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2582.027355] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2582.027938] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.027938] nova-compute[62208]: warnings.warn( [ 2582.028375] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-59264596-6fea-4b38-853d-da89ea3ebf00 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.030292] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.030292] nova-compute[62208]: warnings.warn( [ 2582.033489] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2582.033489] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521e887a-1a19-2781-45d3-060ec1d293f3" [ 2582.033489] nova-compute[62208]: _type = "Task" [ 2582.033489] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2582.036619] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.036619] nova-compute[62208]: warnings.warn( [ 2582.047725] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2582.047982] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating directory with path [datastore2] vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2582.048234] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ed7f5b31-60cf-4517-bf23-9b4d46ed6ea1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.049894] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.049894] nova-compute[62208]: warnings.warn( [ 2582.068737] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Created directory with path [datastore2] vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2582.069049] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Fetch image to [datastore2] vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2582.069362] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2582.070186] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47351312-262b-48a4-ba90-6dddff2f4082 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.072832] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.072832] nova-compute[62208]: warnings.warn( [ 2582.077989] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82d661fc-0e8c-4212-9d13-cc14f09c8230 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.080423] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.080423] nova-compute[62208]: warnings.warn( [ 2582.087733] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d0a1e78-7923-4702-9a2e-b98b1077672c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.091438] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.091438] nova-compute[62208]: warnings.warn( [ 2582.123965] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60dacc0c-cf23-4aad-b97b-b1dc4887bc09 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.126706] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2582.126903] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2582.127077] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Deleting the datastore file [datastore2] 91c64da9-f295-4e84-a8bd-149a72a239da {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2582.127311] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2e8887d4-be76-4118-b161-c50900fc2e4c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.128849] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 
'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.128849] nova-compute[62208]: warnings.warn( [ 2582.129241] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.129241] nova-compute[62208]: warnings.warn( [ 2582.133583] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b95a626f-ffeb-4c01-ac05-86d47799ce6f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.135400] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Waiting for the task: (returnval){ [ 2582.135400] nova-compute[62208]: value = "task-38713" [ 2582.135400] nova-compute[62208]: _type = "Task" [ 2582.135400] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2582.135671] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.135671] nova-compute[62208]: warnings.warn( [ 2582.138690] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.138690] nova-compute[62208]: warnings.warn( [ 2582.141398] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2582.144751] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Task: {'id': task-38713, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2582.159664] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2582.210204] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2582.265184] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2582.265338] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2582.639525] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.639525] nova-compute[62208]: warnings.warn( [ 2582.645252] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Task: {'id': task-38713, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066659} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2582.645500] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2582.645691] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2582.645888] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2582.646058] nova-compute[62208]: INFO nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Took 0.63 seconds to destroy the instance on the hypervisor. [ 2582.648125] nova-compute[62208]: DEBUG nova.compute.claims [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9364a2920> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2582.648303] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2582.648545] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2582.758061] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a6e29f5-359d-45d2-b092-b43eeb5e5629 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.760507] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.760507] nova-compute[62208]: warnings.warn( [ 2582.765680] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fdb2067-7d23-46e7-9df4-b5b010cdae0d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.768865] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.768865] nova-compute[62208]: warnings.warn( [ 2582.794888] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2cf41b8-056e-4ca7-8598-acc1d09e9916 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.797209] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.797209] nova-compute[62208]: warnings.warn( [ 2582.802431] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e61d3eaa-9e04-4786-9b6f-609f58ceb5bc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2582.805988] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2582.805988] nova-compute[62208]: warnings.warn( [ 2582.815268] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2582.823912] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2582.840598] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.192s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2582.841131] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Traceback (most recent call last): [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] self.driver.spawn(context, instance, image_meta, [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] self._fetch_image_if_missing(context, vi) [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 
91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] image_cache(vi, tmp_image_ds_loc) [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] vm_util.copy_virtual_disk( [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] session._wait_for_task(vmdk_copy_task) [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] return self.wait_for_task(task_ref) [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] return evt.wait() [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] result = hub.switch() [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] return self.greenlet.switch() [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] self.f(*self.args, **self.kw) [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] raise exceptions.translate_fault(task_info.error) [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Faults: ['InvalidArgument'] [ 2582.841131] nova-compute[62208]: ERROR nova.compute.manager [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] [ 2582.842022] 
nova-compute[62208]: DEBUG nova.compute.utils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2582.843317] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Build of instance 91c64da9-f295-4e84-a8bd-149a72a239da was re-scheduled: A specified parameter was not correct: fileType [ 2582.843317] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2582.843684] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2582.843860] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2582.844043] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2582.844216] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2583.097338] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2583.111910] nova-compute[62208]: INFO nova.compute.manager [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Took 0.27 seconds to deallocate network for instance. 
[ 2583.143760] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2583.211566] nova-compute[62208]: INFO nova.scheduler.client.report [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Deleted allocations for instance 91c64da9-f295-4e84-a8bd-149a72a239da [ 2583.231169] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe0e156-7571-4f7d-826e-4cda02e5d735 tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "91c64da9-f295-4e84-a8bd-149a72a239da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 378.078s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2583.231436] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "91c64da9-f295-4e84-a8bd-149a72a239da" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 182.620s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2583.231681] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Acquiring lock "91c64da9-f295-4e84-a8bd-149a72a239da-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2583.231861] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "91c64da9-f295-4e84-a8bd-149a72a239da-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2583.232039] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "91c64da9-f295-4e84-a8bd-149a72a239da-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2583.234286] nova-compute[62208]: INFO nova.compute.manager [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Terminating instance [ 2583.235954] nova-compute[62208]: DEBUG nova.compute.manager [None req-7d900c48-e62e-448e-892b-0a03dc07951e 
tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2583.236189] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2583.236867] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-66d5764f-2b6f-4396-9927-d7f350f33da7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2583.239067] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2583.239067] nova-compute[62208]: warnings.warn( [ 2583.246278] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9922147-f978-4d6b-97d7-5c36915a905a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2583.256727] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2583.256727] nova-compute[62208]: warnings.warn( [ 2583.271459] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 91c64da9-f295-4e84-a8bd-149a72a239da could not be found. [ 2583.271673] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2583.271856] nova-compute[62208]: INFO nova.compute.manager [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2583.272115] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2583.272339] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2583.272456] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2583.298235] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2583.306726] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] Took 0.03 seconds to deallocate network for instance. [ 2583.401370] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-7d900c48-e62e-448e-892b-0a03dc07951e tempest-ImagesOneServerNegativeTestJSON-1483459504 tempest-ImagesOneServerNegativeTestJSON-1483459504-project-member] Lock "91c64da9-f295-4e84-a8bd-149a72a239da" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.170s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2583.402225] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "91c64da9-f295-4e84-a8bd-149a72a239da" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 179.005s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2583.402409] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 91c64da9-f295-4e84-a8bd-149a72a239da] During sync_power_state the instance has a pending task (deleting). Skip. [ 2583.402582] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "91c64da9-f295-4e84-a8bd-149a72a239da" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2585.141452] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2585.141452] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2585.932863] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "e5ed059f-0390-480e-bafe-17092f272131" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2586.135950] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2587.140929] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2587.152464] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2587.152718] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2587.152892] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2587.153092] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2587.154212] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-425a0747-e35f-4ddf-aed3-7a01dce0287e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2587.157447] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2587.157447] nova-compute[62208]: warnings.warn( [ 2587.164386] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4da7a28-a71c-4126-9744-6475b0144455 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2587.170393] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2587.170393] nova-compute[62208]: warnings.warn( [ 2587.182550] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fb6f09f-5c69-4bdc-92ab-1a3084cc01db {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2587.185113] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2587.185113] nova-compute[62208]: warnings.warn( [ 2587.191092] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0d5740f-82f6-414d-b2b8-5ea493492322 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2587.195322] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2587.195322] nova-compute[62208]: warnings.warn( [ 2587.227204] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181973MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2587.227521] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2587.227793] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2587.284957] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2587.285204] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 655e577b-5034-4669-8fbd-8495671dd385 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2587.285387] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e5ed059f-0390-480e-bafe-17092f272131 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2587.285573] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c5f6249c-4435-4aad-99ce-f426c042a24a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2587.285804] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2587.285993] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2587.359641] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6421bd9e-ea7a-445f-af6b-4def4a18c948 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2587.362382] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2587.362382] nova-compute[62208]: warnings.warn( [ 2587.367865] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-826fa25c-7e88-48b5-b865-3d99ea3cda8b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2587.371438] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2587.371438] nova-compute[62208]: warnings.warn( [ 2587.401138] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e83f2167-7c2c-4ebe-8e26-29a508fefb25 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2587.403918] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2587.403918] nova-compute[62208]: warnings.warn( [ 2587.410168] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-435fbab9-ce0a-4376-98a8-8101d0d74a41 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2587.414470] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2587.414470] nova-compute[62208]: warnings.warn( [ 2587.428256] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2587.436679] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2587.454494] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2587.454699] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2588.455333] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2589.957314] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "c2db95e8-c625-4c06-bded-237af38df144" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2589.957689] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "c2db95e8-c625-4c06-bded-237af38df144" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2589.970300] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2590.025346] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2590.025815] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2590.027553] nova-compute[62208]: INFO nova.compute.claims [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2590.150063] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31aa756c-9c7a-492f-850b-1989349472d9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2590.152770] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2590.152770] nova-compute[62208]: warnings.warn( [ 2590.158790] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3b187cc-54d7-4f45-aa45-6a33924d82b4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2590.161803] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2590.161803] nova-compute[62208]: warnings.warn( [ 2590.191146] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d37e5c69-7d47-4612-8124-f7fb59253ed7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2590.193723] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2590.193723] nova-compute[62208]: warnings.warn( [ 2590.199658] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06d2747c-3d3a-4551-93c6-e8704afb21f9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2590.204236] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2590.204236] nova-compute[62208]: warnings.warn( [ 2590.214975] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2590.224539] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2590.243712] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2590.243879] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2590.280352] nova-compute[62208]: DEBUG nova.compute.utils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2590.281759] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2590.281931] nova-compute[62208]: DEBUG nova.network.neutron [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2590.295967] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2590.333219] nova-compute[62208]: DEBUG nova.policy [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd51a1b598f0e44c28e2b6cdcbe7ac23e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a811291fa75242d5b998655672131068', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2590.380211] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2590.401502] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2590.401750] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2590.401907] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2590.402089] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2590.402236] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2590.402384] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2590.402617] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2590.403063] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2590.403063] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2590.403205] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2590.403286] nova-compute[62208]: DEBUG nova.virt.hardware [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2590.404164] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba89dd19-d452-4ab8-af5c-25533ed51c5f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2590.406636] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2590.406636] nova-compute[62208]: warnings.warn( [ 2590.412684] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aba33edb-75dc-4f03-a2f9-fb90316ea988 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2590.417682] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2590.417682] nova-compute[62208]: warnings.warn( [ 2590.699951] nova-compute[62208]: DEBUG nova.network.neutron [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Successfully created port: 0494459b-542c-41d5-a4a6-98e75b273bea {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2590.975590] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "c5f6249c-4435-4aad-99ce-f426c042a24a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2591.510491] nova-compute[62208]: DEBUG nova.compute.manager [req-f0c6581b-220e-4528-9d0d-30f248a07de0 req-c77ad7c5-7a3c-428c-9264-e5f0aa473aec service nova] [instance: c2db95e8-c625-4c06-bded-237af38df144] Received event network-vif-plugged-0494459b-542c-41d5-a4a6-98e75b273bea {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2591.510862] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f0c6581b-220e-4528-9d0d-30f248a07de0 req-c77ad7c5-7a3c-428c-9264-e5f0aa473aec service nova] Acquiring lock "c2db95e8-c625-4c06-bded-237af38df144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2591.510988] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f0c6581b-220e-4528-9d0d-30f248a07de0 req-c77ad7c5-7a3c-428c-9264-e5f0aa473aec service nova] Lock "c2db95e8-c625-4c06-bded-237af38df144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2591.511230] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f0c6581b-220e-4528-9d0d-30f248a07de0 req-c77ad7c5-7a3c-428c-9264-e5f0aa473aec service nova] Lock "c2db95e8-c625-4c06-bded-237af38df144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2591.511459] nova-compute[62208]: DEBUG nova.compute.manager [req-f0c6581b-220e-4528-9d0d-30f248a07de0 req-c77ad7c5-7a3c-428c-9264-e5f0aa473aec service nova] [instance: c2db95e8-c625-4c06-bded-237af38df144] No waiting events found dispatching network-vif-plugged-0494459b-542c-41d5-a4a6-98e75b273bea {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2591.511837] nova-compute[62208]: WARNING nova.compute.manager [req-f0c6581b-220e-4528-9d0d-30f248a07de0 req-c77ad7c5-7a3c-428c-9264-e5f0aa473aec service nova] [instance: c2db95e8-c625-4c06-bded-237af38df144] Received unexpected event network-vif-plugged-0494459b-542c-41d5-a4a6-98e75b273bea for instance with vm_state building and task_state spawning. 
[ 2591.560722] nova-compute[62208]: DEBUG nova.network.neutron [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Successfully updated port: 0494459b-542c-41d5-a4a6-98e75b273bea {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2591.573435] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "refresh_cache-c2db95e8-c625-4c06-bded-237af38df144" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2591.573911] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquired lock "refresh_cache-c2db95e8-c625-4c06-bded-237af38df144" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2591.574281] nova-compute[62208]: DEBUG nova.network.neutron [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2591.661341] nova-compute[62208]: DEBUG nova.network.neutron [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2591.816638] nova-compute[62208]: DEBUG nova.network.neutron [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Updating instance_info_cache with network_info: [{"id": "0494459b-542c-41d5-a4a6-98e75b273bea", "address": "fa:16:3e:24:c7:a0", "network": {"id": "3920080a-4a37-46bb-98d8-615215934385", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-635270513-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a811291fa75242d5b998655672131068", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e885ebd4-93ca-4e9e-8889-0f16bd91e61e", "external-id": "nsx-vlan-transportzone-580", "segmentation_id": 580, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0494459b-54", "ovs_interfaceid": "0494459b-542c-41d5-a4a6-98e75b273bea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2591.830937] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Releasing lock "refresh_cache-c2db95e8-c625-4c06-bded-237af38df144" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2591.831351] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Instance network_info: |[{"id": "0494459b-542c-41d5-a4a6-98e75b273bea", "address": "fa:16:3e:24:c7:a0", "network": {"id": "3920080a-4a37-46bb-98d8-615215934385", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-635270513-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a811291fa75242d5b998655672131068", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e885ebd4-93ca-4e9e-8889-0f16bd91e61e", "external-id": "nsx-vlan-transportzone-580", "segmentation_id": 580, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0494459b-54", "ovs_interfaceid": "0494459b-542c-41d5-a4a6-98e75b273bea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 
2591.832121] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:c7:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e885ebd4-93ca-4e9e-8889-0f16bd91e61e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0494459b-542c-41d5-a4a6-98e75b273bea', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2591.839882] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Creating folder: Project (a811291fa75242d5b998655672131068). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2591.840930] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d8410c4f-3fd4-47f2-9e28-6f9d6aae23ea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2591.842675] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2591.842675] nova-compute[62208]: warnings.warn( [ 2591.851305] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Created folder: Project (a811291fa75242d5b998655672131068) in parent group-v17427. [ 2591.851586] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Creating folder: Instances. Parent ref: group-v17571. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2591.851879] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eca72a50-d999-4521-baf9-0ad159a9b50d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2591.853501] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2591.853501] nova-compute[62208]: warnings.warn( [ 2591.860651] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Created folder: Instances in parent group-v17571. 
[ 2591.860964] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2591.861211] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2db95e8-c625-4c06-bded-237af38df144] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2591.861490] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4979bab0-2f39-42ce-b081-32b5106ca46b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2591.875925] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2591.875925] nova-compute[62208]: warnings.warn( [ 2591.881728] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2591.881728] nova-compute[62208]: value = "task-38716" [ 2591.881728] nova-compute[62208]: _type = "Task" [ 2591.881728] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2591.885071] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2591.885071] nova-compute[62208]: warnings.warn( [ 2591.891066] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38716, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2592.386399] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2592.386399] nova-compute[62208]: warnings.warn( [ 2592.392702] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38716, 'name': CreateVM_Task, 'duration_secs': 0.331191} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2592.392986] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2db95e8-c625-4c06-bded-237af38df144] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2592.400074] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2592.400474] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2592.403474] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3ba7bbf-2f15-4f75-81ef-19e8b043986b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2592.413784] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2592.413784] nova-compute[62208]: warnings.warn( [ 2592.433523] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2592.433943] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-7acb7170-9476-48b5-a79a-bb49df8d457b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2592.444139] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2592.444139] nova-compute[62208]: warnings.warn( [ 2592.449145] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 2592.449145] nova-compute[62208]: value = "task-38717" [ 2592.449145] nova-compute[62208]: _type = "Task" [ 2592.449145] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2592.452196] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2592.452196] nova-compute[62208]: warnings.warn( [ 2592.457345] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38717, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2592.953626] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2592.953626] nova-compute[62208]: warnings.warn( [ 2592.959559] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38717, 'name': ReconfigVM_Task, 'duration_secs': 0.113377} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2592.959923] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2592.960220] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.560s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2592.960639] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2592.960922] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2592.961301] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a 
tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2592.961612] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6cb3db1b-81d2-462f-8a6d-4f1d5a86cbc5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2592.963136] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2592.963136] nova-compute[62208]: warnings.warn( [ 2592.966138] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 2592.966138] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f94d35-d648-3035-8d96-4f4db6a3851c" [ 2592.966138] nova-compute[62208]: _type = "Task" [ 2592.966138] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2592.969107] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2592.969107] nova-compute[62208]: warnings.warn( [ 2592.974121] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52f94d35-d648-3035-8d96-4f4db6a3851c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2593.471191] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2593.471191] nova-compute[62208]: warnings.warn( [ 2593.477254] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2593.477494] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2593.477713] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2593.534819] nova-compute[62208]: DEBUG nova.compute.manager [req-73e285bf-c68e-48c4-beaf-a1631d8dd192 req-6c176180-8619-424c-b472-6f73982c59fa service nova] [instance: c2db95e8-c625-4c06-bded-237af38df144] Received event network-changed-0494459b-542c-41d5-a4a6-98e75b273bea {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2593.534998] nova-compute[62208]: DEBUG nova.compute.manager [req-73e285bf-c68e-48c4-beaf-a1631d8dd192 req-6c176180-8619-424c-b472-6f73982c59fa service nova] [instance: c2db95e8-c625-4c06-bded-237af38df144] Refreshing instance network info cache due to event network-changed-0494459b-542c-41d5-a4a6-98e75b273bea. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2593.535536] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-73e285bf-c68e-48c4-beaf-a1631d8dd192 req-6c176180-8619-424c-b472-6f73982c59fa service nova] Acquiring lock "refresh_cache-c2db95e8-c625-4c06-bded-237af38df144" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2593.535688] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-73e285bf-c68e-48c4-beaf-a1631d8dd192 req-6c176180-8619-424c-b472-6f73982c59fa service nova] Acquired lock "refresh_cache-c2db95e8-c625-4c06-bded-237af38df144" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2593.535859] nova-compute[62208]: DEBUG nova.network.neutron [req-73e285bf-c68e-48c4-beaf-a1631d8dd192 req-6c176180-8619-424c-b472-6f73982c59fa service nova] [instance: c2db95e8-c625-4c06-bded-237af38df144] Refreshing network info cache for port 0494459b-542c-41d5-a4a6-98e75b273bea {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2593.752254] nova-compute[62208]: DEBUG nova.network.neutron [req-73e285bf-c68e-48c4-beaf-a1631d8dd192 req-6c176180-8619-424c-b472-6f73982c59fa service nova] [instance: c2db95e8-c625-4c06-bded-237af38df144] Updated VIF entry in instance network info cache for port 0494459b-542c-41d5-a4a6-98e75b273bea. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2593.752613] nova-compute[62208]: DEBUG nova.network.neutron [req-73e285bf-c68e-48c4-beaf-a1631d8dd192 req-6c176180-8619-424c-b472-6f73982c59fa service nova] [instance: c2db95e8-c625-4c06-bded-237af38df144] Updating instance_info_cache with network_info: [{"id": "0494459b-542c-41d5-a4a6-98e75b273bea", "address": "fa:16:3e:24:c7:a0", "network": {"id": "3920080a-4a37-46bb-98d8-615215934385", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-635270513-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a811291fa75242d5b998655672131068", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e885ebd4-93ca-4e9e-8889-0f16bd91e61e", "external-id": "nsx-vlan-transportzone-580", "segmentation_id": 580, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0494459b-54", "ovs_interfaceid": "0494459b-542c-41d5-a4a6-98e75b273bea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2593.763717] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-73e285bf-c68e-48c4-beaf-a1631d8dd192 req-6c176180-8619-424c-b472-6f73982c59fa service nova] Releasing lock "refresh_cache-c2db95e8-c625-4c06-bded-237af38df144" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2599.137097] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2630.994181] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2630.994181] 
nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2630.994181] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2630.994181] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2630.994976] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2630.994976] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Copying Virtual Disk [datastore2] vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/3a9bea9a-e4fb-4593-8b89-e40794901608/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2630.995211] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-22819f43-57c5-4d2c-a0b1-d6d24cfeeb06 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2630.997679] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2630.997679] nova-compute[62208]: warnings.warn( [ 2631.004926] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2631.004926] nova-compute[62208]: value = "task-38718" [ 2631.004926] nova-compute[62208]: _type = "Task" [ 2631.004926] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2631.008338] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.008338] nova-compute[62208]: warnings.warn( [ 2631.013526] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38718, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2631.510572] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.510572] nova-compute[62208]: warnings.warn( [ 2631.516354] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2631.516733] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2631.517331] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Traceback (most recent call last): [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] yield resources [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] self.driver.spawn(context, instance, image_meta, [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 
2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] self._fetch_image_if_missing(context, vi) [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] image_cache(vi, tmp_image_ds_loc) [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] vm_util.copy_virtual_disk( [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] session._wait_for_task(vmdk_copy_task) [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] return self.wait_for_task(task_ref) [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] return evt.wait() [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] result = hub.switch() [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] return self.greenlet.switch() [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] self.f(*self.args, **self.kw) [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] raise exceptions.translate_fault(task_info.error) [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2631.517331] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Faults: ['InvalidArgument'] [ 2631.517331] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] [ 2631.518503] nova-compute[62208]: INFO nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Terminating instance [ 2631.519801] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2631.520060] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2631.520351] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0e6d3ce0-df87-4f9b-8b53-478dfcca3ca3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2631.522674] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2631.522924] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2631.523715] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c11f295-5815-4c43-aab5-847ce3eccb04 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2631.526428] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.526428] nova-compute[62208]: warnings.warn( [ 2631.526878] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.526878] nova-compute[62208]: warnings.warn( [ 2631.531350] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2631.531676] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-97477e06-e8be-40d7-bc2e-3775f7fea575 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2631.534074] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2631.534292] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2631.534910] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.534910] nova-compute[62208]: warnings.warn( [ 2631.535366] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7ac824ca-4009-484e-adb0-b32fba7efafa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2631.537388] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.537388] nova-compute[62208]: warnings.warn( [ 2631.541228] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2631.541228] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52104c44-b519-9078-2f82-678ae872d9ec" [ 2631.541228] nova-compute[62208]: _type = "Task" [ 2631.541228] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2631.544968] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.544968] nova-compute[62208]: warnings.warn( [ 2631.554400] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52104c44-b519-9078-2f82-678ae872d9ec, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2631.598278] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2631.598654] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2631.598894] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleting the datastore file [datastore2] 75ca5bb3-c856-4548-924f-3ff3614b0f63 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2631.599210] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eae9b86b-e5e4-42be-a4e9-e70096fdbd01 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2631.601180] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.601180] nova-compute[62208]: warnings.warn( [ 2631.605978] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for the task: (returnval){ [ 2631.605978] nova-compute[62208]: value = "task-38720" [ 2631.605978] nova-compute[62208]: _type = "Task" [ 2631.605978] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2631.609543] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2631.609543] nova-compute[62208]: warnings.warn( [ 2631.614497] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38720, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2632.045741] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.045741] nova-compute[62208]: warnings.warn( [ 2632.052258] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2632.052722] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating directory with path [datastore2] vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2632.053084] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7df6836b-88db-47b4-bc06-7752b854c000 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.054907] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.054907] nova-compute[62208]: warnings.warn( [ 2632.065242] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2632.065662] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Fetch image to [datastore2] vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2632.065978] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2632.066922] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1a527e6-6916-477b-914b-afcdd7f8a798 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.069509] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.069509] nova-compute[62208]: warnings.warn( [ 2632.074533] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0e41f43-11cf-4137-998c-cb222b9f7531 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.077060] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.077060] nova-compute[62208]: warnings.warn( [ 2632.085373] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dd7f8cf-245d-49d2-beb2-862d6fd45ac8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.089284] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.089284] nova-compute[62208]: warnings.warn( [ 2632.121411] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-440d1e54-7a70-4a40-985d-da3e00bcc1fb {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.123880] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.123880] nova-compute[62208]: warnings.warn( [ 2632.124500] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.124500] nova-compute[62208]: warnings.warn( [ 2632.129899] nova-compute[62208]: DEBUG oslo_vmware.api [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Task: {'id': task-38720, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076413} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2632.131653] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2632.131990] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2632.132305] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2632.132615] nova-compute[62208]: INFO nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2632.134739] nova-compute[62208]: DEBUG nova.compute.claims [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Aborting claim: <nova.compute.claims.Claim object at 0x7fb93695f2e0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2632.135043] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2632.135382] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2632.137961] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bcc0126a-2db6-4a28-9a02-a9ee0b1df7bf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.139988] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.139988] nova-compute[62208]: warnings.warn( [ 2632.160083] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2632.209624] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2632.266628] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2632.266825] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2632.302347] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33bc20b-5c82-44b3-bd8d-5eb5852925b9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.306377] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.306377] nova-compute[62208]: warnings.warn( [ 2632.313973] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bad687f3-9dd4-4596-a47c-72bb6444136d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.317856] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.317856] nova-compute[62208]: warnings.warn( [ 2632.345839] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e1e2941-6965-4c1f-96ed-e29f1fe03693 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.348315] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.348315] nova-compute[62208]: warnings.warn( [ 2632.353621] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a566aea3-ec34-44af-9df6-b038f71eb35c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.357329] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.357329] nova-compute[62208]: warnings.warn( [ 2632.367251] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2632.379274] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2632.396117] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.261s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2632.396692] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Traceback (most recent call last): [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] self.driver.spawn(context, instance, image_meta, [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] self._fetch_image_if_missing(context, vi) [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] image_cache(vi, tmp_image_ds_loc) [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] vm_util.copy_virtual_disk( [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] session._wait_for_task(vmdk_copy_task) [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] return self.wait_for_task(task_ref) [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] return evt.wait() [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] result = hub.switch() [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] return self.greenlet.switch() [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] self.f(*self.args, **self.kw) [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] raise exceptions.translate_fault(task_info.error) [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Faults: ['InvalidArgument'] [ 2632.396692] nova-compute[62208]: ERROR nova.compute.manager [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] [ 2632.397856] nova-compute[62208]: DEBUG nova.compute.utils 
[None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2632.399126] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Build of instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 was re-scheduled: A specified parameter was not correct: fileType [ 2632.399126] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2632.399521] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2632.399695] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2632.399866] nova-compute[62208]: DEBUG nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2632.400043] nova-compute[62208]: DEBUG nova.network.neutron [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2632.666318] nova-compute[62208]: DEBUG nova.network.neutron [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2632.678227] nova-compute[62208]: INFO nova.compute.manager [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Took 0.28 seconds to deallocate network for instance. 
[ 2632.779412] nova-compute[62208]: INFO nova.scheduler.client.report [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Deleted allocations for instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 [ 2632.804272] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-2b976214-04e8-42af-96fa-96768551d224 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 385.413s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2632.804546] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 228.407s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2632.804734] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] During sync_power_state the instance has a pending task (spawning). Skip. [ 2632.804907] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2632.805125] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 190.120s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2632.805332] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Acquiring lock "75ca5bb3-c856-4548-924f-3ff3614b0f63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2632.805530] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2632.805694] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock 
"75ca5bb3-c856-4548-924f-3ff3614b0f63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2632.807617] nova-compute[62208]: INFO nova.compute.manager [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Terminating instance [ 2632.809324] nova-compute[62208]: DEBUG nova.compute.manager [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2632.809537] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2632.810047] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2fe5a3ce-4672-404a-9ef5-023134503b7f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.811887] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.811887] nova-compute[62208]: warnings.warn( [ 2632.820205] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86da830d-0501-447e-b598-43764db0c1e8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2632.830246] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2632.830246] nova-compute[62208]: warnings.warn( [ 2632.845711] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 75ca5bb3-c856-4548-924f-3ff3614b0f63 could not be found. 
[ 2632.845947] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2632.846132] nova-compute[62208]: INFO nova.compute.manager [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2632.846382] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2632.846882] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2632.846978] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2632.879260] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2632.887855] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 75ca5bb3-c856-4548-924f-3ff3614b0f63] Took 0.04 seconds to deallocate network for instance. 
[ 2632.977604] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-0863b084-991e-46a1-9efa-3d0feed04ac7 tempest-ServerDiskConfigTestJSON-626819497 tempest-ServerDiskConfigTestJSON-626819497-project-member] Lock "75ca5bb3-c856-4548-924f-3ff3614b0f63" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.172s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2636.054147] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "d238f196-f7e6-455b-b514-e8475e204e82" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2636.054560] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "d238f196-f7e6-455b-b514-e8475e204e82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2636.071335] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2636.121203] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2636.121344] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2636.122940] nova-compute[62208]: INFO nova.compute.claims [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2636.141013] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2636.245855] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a15d69e-f993-43a6-96f3-231df1c58d73 {{(pid=62208) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.248519] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.248519] nova-compute[62208]: warnings.warn( [ 2636.253707] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fc66245-4b6c-44df-ae66-119b7ecf31ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.257001] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.257001] nova-compute[62208]: warnings.warn( [ 2636.284240] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3f9fd57-868f-48e6-86bb-9a8ab9c2c435 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.286604] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.286604] nova-compute[62208]: warnings.warn( [ 2636.292313] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c10612f7-674e-4823-bc12-d6648c7fc150 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.295962] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.295962] nova-compute[62208]: warnings.warn( [ 2636.306123] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2636.315303] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2636.331716] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.210s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2636.331716] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2636.367469] nova-compute[62208]: DEBUG nova.compute.utils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2636.368817] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Not allocating networking since 'none' was specified. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1968}} [ 2636.380396] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2636.448895] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2636.470260] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2636.470495] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2636.470654] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2636.470836] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2636.470983] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2636.471129] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2636.471334] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2636.471563] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2636.471760] nova-compute[62208]: 
DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2636.471926] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2636.472119] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2636.472983] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70d32e3d-c31f-4c8e-a312-a056c6dcf305 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.475438] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.475438] nova-compute[62208]: warnings.warn( [ 2636.481079] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99d9bcdb-4542-419a-8396-665f54b1f6ec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.485055] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.485055] nova-compute[62208]: warnings.warn( [ 2636.495219] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2636.500618] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Creating folder: Project (bbaa66fa089d4263952b55efbd6b9897). Parent ref: group-v17427. 
{{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2636.500886] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ae264cd-2b1d-48a8-8529-4148b7eb7389 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.502263] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.502263] nova-compute[62208]: warnings.warn( [ 2636.511516] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Created folder: Project (bbaa66fa089d4263952b55efbd6b9897) in parent group-v17427. [ 2636.511695] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Creating folder: Instances. Parent ref: group-v17574. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2636.511926] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1d1b7284-97ab-459e-a179-e9857b6c55d8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.513394] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.513394] nova-compute[62208]: warnings.warn( [ 2636.520903] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Created folder: Instances in parent group-v17574. [ 2636.521142] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2636.521329] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2636.521531] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-28550c48-4f19-4a85-8ef0-2a4f9536bd02 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2636.533898] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.533898] nova-compute[62208]: warnings.warn( [ 2636.538533] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2636.538533] nova-compute[62208]: value = "task-38723" [ 2636.538533] nova-compute[62208]: _type = "Task" [ 2636.538533] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2636.541500] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2636.541500] nova-compute[62208]: warnings.warn( [ 2636.546285] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38723, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2637.043250] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2637.043250] nova-compute[62208]: warnings.warn( [ 2637.049247] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38723, 'name': CreateVM_Task, 'duration_secs': 0.256226} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2637.049424] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2637.049747] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2637.049981] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2637.052874] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7364cb7c-66f8-4d7c-bcca-eb6ca363eaf5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2637.063423] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2637.063423] nova-compute[62208]: warnings.warn( [ 2637.085231] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Reconfiguring VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2637.085962] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-3eae0186-301e-44aa-b16c-3de85bb0f76c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2637.096604] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2637.096604] nova-compute[62208]: warnings.warn( [ 2637.102429] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Waiting for the task: (returnval){ [ 2637.102429] nova-compute[62208]: value = "task-38724" [ 2637.102429] nova-compute[62208]: _type = "Task" [ 2637.102429] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2637.105357] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2637.105357] nova-compute[62208]: warnings.warn( [ 2637.112119] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Task: {'id': task-38724, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2637.606852] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2637.606852] nova-compute[62208]: warnings.warn( [ 2637.612876] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Task: {'id': task-38724, 'name': ReconfigVM_Task, 'duration_secs': 0.120719} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2637.613188] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Reconfigured VM instance to enable vnc on port - 5902 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2637.613405] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.563s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2637.613654] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2637.613798] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2637.614111] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2637.614367] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c064ecf-2a86-4535-81ac-a2883736ee96 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2637.615900] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2637.615900] nova-compute[62208]: warnings.warn( [ 2637.619100] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Waiting for the task: (returnval){ [ 2637.619100] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5248407a-7213-1cc8-cbd8-6539381af9ba" [ 2637.619100] nova-compute[62208]: _type = "Task" [ 2637.619100] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2637.622091] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2637.622091] nova-compute[62208]: warnings.warn( [ 2637.626665] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5248407a-7213-1cc8-cbd8-6539381af9ba, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2638.123184] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2638.123184] nova-compute[62208]: warnings.warn( [ 2638.130660] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2638.130912] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2638.131119] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2639.141368] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2639.141751] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2639.141751] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2639.156251] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 
655e577b-5034-4669-8fbd-8495671dd385] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2639.156436] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e5ed059f-0390-480e-bafe-17092f272131] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2639.156582] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2639.156703] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c2db95e8-c625-4c06-bded-237af38df144] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2639.156827] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2639.156945] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2640.141335] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2644.142144] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2645.141187] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2646.135990] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2646.140736] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2646.140898] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2648.141637] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2649.141269] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2649.152429] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2649.152663] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2649.152822] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2649.152976] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2649.154113] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39187dfa-72fe-4df9-afe8-b1cbdf4b4019 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2649.157299] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2649.157299] nova-compute[62208]: warnings.warn( [ 2649.163098] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eee6c57-328d-4367-9be7-88800a254b1f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2649.166819] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2649.166819] nova-compute[62208]: warnings.warn( [ 2649.177043] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7deeb4a-c6c4-49a3-81e4-ebd32e7d3c91 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2649.179245] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2649.179245] nova-compute[62208]: warnings.warn( [ 2649.183729] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40cb0e22-d9e8-4e97-a6f6-de2ca35a57ba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2649.186643] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2649.186643] nova-compute[62208]: warnings.warn( [ 2649.212720] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181973MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2649.212852] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2649.213042] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2649.320065] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 655e577b-5034-4669-8fbd-8495671dd385 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2649.320244] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e5ed059f-0390-480e-bafe-17092f272131 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2649.320374] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c5f6249c-4435-4aad-99ce-f426c042a24a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2649.320506] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c2db95e8-c625-4c06-bded-237af38df144 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2649.320630] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d238f196-f7e6-455b-b514-e8475e204e82 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2649.320854] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2649.320997] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2649.337196] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing inventories for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2649.349156] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating ProviderTree inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2649.349346] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2649.360179] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing aggregate associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, aggregates: None {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2649.376224] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing trait associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2649.445195] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66186924-8631-48de-baa1-47b2ed966d43 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2649.448214] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2649.448214] nova-compute[62208]: warnings.warn( [ 2649.453527] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a34ed2c-be32-4e68-93a9-a8b33e497ad2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2649.456262] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2649.456262] nova-compute[62208]: warnings.warn( [ 2649.483823] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d979cfa-ee5a-4351-93be-f4635d94b3d1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2649.486089] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2649.486089] nova-compute[62208]: warnings.warn( [ 2649.491190] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ecf59c3-1586-41bc-b9dd-0e5bb3754f96 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2649.494683] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2649.494683] nova-compute[62208]: warnings.warn( [ 2649.503874] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2649.512954] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2649.529839] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2649.530045] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2652.141036] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2652.141460] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances with incomplete migration {{(pid=62208) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11257}} [ 2657.151261] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2657.151656] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11219}} [ 2657.160636] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] There are 0 instances to clean {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11228}} [ 2669.144661] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2680.196637] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None 
req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2680.196637] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2680.197340] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2680.199342] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2680.199584] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Copying Virtual Disk [datastore2] vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/82cd3055-bfcd-4a64-a9ab-9ef6886b799c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2680.199871] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1c434686-4891-4070-805c-67ac97ee803e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.202052] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.202052] nova-compute[62208]: warnings.warn( [ 2680.207799] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2680.207799] nova-compute[62208]: value = "task-38725" [ 2680.207799] nova-compute[62208]: _type = "Task" [ 2680.207799] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2680.212918] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.212918] nova-compute[62208]: warnings.warn( [ 2680.218414] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38725, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2680.712222] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.712222] nova-compute[62208]: warnings.warn( [ 2680.718126] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2680.718430] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2680.718989] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] Traceback (most recent call last): [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] yield resources [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] self.driver.spawn(context, instance, image_meta, [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] self._fetch_image_if_missing(context, vi) [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] image_cache(vi, tmp_image_ds_loc) [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] vm_util.copy_virtual_disk( [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] session._wait_for_task(vmdk_copy_task) [ 2680.718989] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] return self.wait_for_task(task_ref) [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] return evt.wait() [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] result = hub.switch() [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] return self.greenlet.switch() [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] self.f(*self.args, **self.kw) [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] raise exceptions.translate_fault(task_info.error) [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] Faults: ['InvalidArgument'] [ 2680.718989] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] [ 2680.720120] nova-compute[62208]: INFO nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Terminating instance [ 2680.720949] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2680.721148] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Creating 
directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2680.721386] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4b98fcce-14d9-43c5-9296-af64114ce17b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.723866] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2680.724065] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2680.724801] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5be4df52-edd7-428c-acab-3478667a4f54 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.727204] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.727204] nova-compute[62208]: warnings.warn( [ 2680.727544] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.727544] nova-compute[62208]: warnings.warn( [ 2680.732022] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2680.732247] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c1f49e46-2230-42c0-b3f1-9ae11e6dd2a7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.734616] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2680.734793] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2680.735363] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.735363] nova-compute[62208]: warnings.warn( [ 2680.735860] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6af5ae23-bdc5-4e62-8663-029eecf0f57a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.738121] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.738121] nova-compute[62208]: warnings.warn( [ 2680.741095] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Waiting for the task: (returnval){ [ 2680.741095] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e95de9-a5a9-01a1-1806-b6d57be0530d" [ 2680.741095] nova-compute[62208]: _type = "Task" [ 2680.741095] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2680.744038] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.744038] nova-compute[62208]: warnings.warn( [ 2680.749057] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52e95de9-a5a9-01a1-1806-b6d57be0530d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2680.806530] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2680.806803] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2680.806961] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleting the datastore file [datastore2] 655e577b-5034-4669-8fbd-8495671dd385 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2680.807246] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ac45a15a-4ff3-4157-945e-225b148a8799 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.809173] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.809173] nova-compute[62208]: warnings.warn( [ 2680.825821] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2680.825821] nova-compute[62208]: value = "task-38727" [ 2680.825821] nova-compute[62208]: _type = "Task" [ 2680.825821] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2680.829712] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2680.829712] nova-compute[62208]: warnings.warn( [ 2680.835271] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38727, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2681.245093] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.245093] nova-compute[62208]: warnings.warn( [ 2681.251539] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2681.251790] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Creating directory with path [datastore2] vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2681.252069] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ba1a8c92-48c3-4073-b034-376635994a7e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.253725] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.253725] nova-compute[62208]: warnings.warn( [ 2681.264673] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Created directory with path [datastore2] vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2681.264874] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Fetch image to [datastore2] vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2681.265044] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2681.265823] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660080a9-9cae-4a87-a345-e69d1f2b6a6c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.268101] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.268101] nova-compute[62208]: warnings.warn( [ 2681.272819] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77e9b6d0-f180-4141-8417-c5fe3fc0ed22 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.274966] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.274966] nova-compute[62208]: warnings.warn( [ 2681.281810] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3129fd7e-175e-421e-bf99-a763fcde68b6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.285227] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.285227] nova-compute[62208]: warnings.warn( [ 2681.314509] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b8522d-447a-4059-9b8a-45a48bfc3fb0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.316843] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.316843] nova-compute[62208]: warnings.warn( [ 2681.320682] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6264926d-72a0-4737-84fd-265a06ca7354 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.322237] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.322237] nova-compute[62208]: warnings.warn( [ 2681.329406] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.329406] nova-compute[62208]: warnings.warn( [ 2681.334444] nova-compute[62208]: DEBUG oslo_vmware.api [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38727, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075516} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2681.334684] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2681.334859] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2681.335026] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2681.335250] nova-compute[62208]: INFO nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2681.337430] nova-compute[62208]: DEBUG nova.compute.claims [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9369deec0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2681.337599] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2681.337810] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2681.344055] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2681.454493] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-907471e1-f1b0-4888-9e41-2c75e03e322b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.457620] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2681.458721] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.458721] nova-compute[62208]: warnings.warn( [ 2681.511970] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ded8e5d7-e648-4ef7-9036-5d6e59752e49 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.516319] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.516319] nova-compute[62208]: warnings.warn( [ 2681.517446] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2681.517624] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2681.543849] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78f7144f-1a73-4be1-a8dd-e00617b2330c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.546228] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.546228] nova-compute[62208]: warnings.warn( [ 2681.551516] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c24cce6f-e914-4d69-982e-520ec2a7af2a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.555046] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2681.555046] nova-compute[62208]: warnings.warn( [ 2681.564392] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2681.573081] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2681.609803] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.272s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2681.610314] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] Traceback (most recent call last): [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] self.driver.spawn(context, instance, image_meta, [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] self._fetch_image_if_missing(context, vi) [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2681.610314] 
nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] image_cache(vi, tmp_image_ds_loc) [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] vm_util.copy_virtual_disk( [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] session._wait_for_task(vmdk_copy_task) [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] return self.wait_for_task(task_ref) [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] return evt.wait() [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] result = hub.switch() [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] return self.greenlet.switch() [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] self.f(*self.args, **self.kw) [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] raise exceptions.translate_fault(task_info.error) [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] Faults: ['InvalidArgument'] [ 2681.610314] nova-compute[62208]: ERROR nova.compute.manager [instance: 655e577b-5034-4669-8fbd-8495671dd385] [ 2681.611245] nova-compute[62208]: DEBUG nova.compute.utils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 
tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2681.612923] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Build of instance 655e577b-5034-4669-8fbd-8495671dd385 was re-scheduled: A specified parameter was not correct: fileType [ 2681.612923] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2681.613284] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2681.613459] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2681.613629] nova-compute[62208]: DEBUG nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2681.613794] nova-compute[62208]: DEBUG nova.network.neutron [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2681.859405] nova-compute[62208]: DEBUG nova.network.neutron [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2681.871788] nova-compute[62208]: INFO nova.compute.manager [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Took 0.26 seconds to deallocate network for instance. 
[ 2681.971384] nova-compute[62208]: INFO nova.scheduler.client.report [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted allocations for instance 655e577b-5034-4669-8fbd-8495671dd385 [ 2681.992429] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-15772db2-d8b0-4b2a-901c-1ca94b09996f tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "655e577b-5034-4669-8fbd-8495671dd385" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 335.087s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2681.992716] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "655e577b-5034-4669-8fbd-8495671dd385" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 277.595s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2681.993065] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 655e577b-5034-4669-8fbd-8495671dd385] During sync_power_state the instance has a pending task (spawning). Skip. [ 2681.993065] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "655e577b-5034-4669-8fbd-8495671dd385" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2681.993552] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "655e577b-5034-4669-8fbd-8495671dd385" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 138.499s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2681.993828] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "655e577b-5034-4669-8fbd-8495671dd385-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2681.993996] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "655e577b-5034-4669-8fbd-8495671dd385-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2681.994133] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "655e577b-5034-4669-8fbd-8495671dd385-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2681.996094] nova-compute[62208]: INFO nova.compute.manager [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Terminating instance [ 2681.997931] nova-compute[62208]: DEBUG nova.compute.manager [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2681.998110] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2681.998364] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-26cc1feb-7c6c-4ebc-be4a-23c1641949c0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2682.001010] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2682.001010] nova-compute[62208]: warnings.warn( [ 2682.008200] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2248d9f-d61b-4067-82fe-6be85cb828df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2682.018611] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2682.018611] nova-compute[62208]: warnings.warn( [ 2682.033439] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 655e577b-5034-4669-8fbd-8495671dd385 could not be found. 
[ 2682.033650] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2682.033832] nova-compute[62208]: INFO nova.compute.manager [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2682.034079] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2682.034302] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2682.034384] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 655e577b-5034-4669-8fbd-8495671dd385] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2682.064141] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2682.073835] nova-compute[62208]: INFO nova.compute.manager [-] [instance: 655e577b-5034-4669-8fbd-8495671dd385] Took 0.04 seconds to deallocate network for instance. 
[ 2682.164783] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-dd452887-cc6a-4c29-bdd2-5a6aa9fe1698 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "655e577b-5034-4669-8fbd-8495671dd385" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.171s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2683.010274] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "d5289eeb-c269-431e-9a8e-d27487e12b2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2683.010517] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "d5289eeb-c269-431e-9a8e-d27487e12b2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2683.021544] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2683.073078] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2683.073335] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2683.074826] nova-compute[62208]: INFO nova.compute.claims [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2683.197358] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b5d1bbc-11ae-4d09-82ef-a5dc2490e40b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.199953] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2683.199953] nova-compute[62208]: warnings.warn( [ 2683.205250] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b383af77-d198-4baf-92a1-d7a52991d4c5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.209588] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2683.209588] nova-compute[62208]: warnings.warn( [ 2683.235793] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42403872-fcf8-4306-86f1-2fff77fd52db {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.238089] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2683.238089] nova-compute[62208]: warnings.warn( [ 2683.243662] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb2ee4f-0d4c-4fde-ae39-302cc3fbb60f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.247180] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2683.247180] nova-compute[62208]: warnings.warn( [ 2683.257019] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2683.266800] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2683.289117] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2683.289611] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2683.321771] nova-compute[62208]: DEBUG nova.compute.utils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2683.322992] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2683.323167] nova-compute[62208]: DEBUG nova.network.neutron [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2683.335043] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2683.370676] nova-compute[62208]: DEBUG nova.policy [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7534a5a8a37e4451918e35c8b93d4ad5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8eef1e68dea42cf98f03dc8db29498a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2683.400922] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2683.421921] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2683.422219] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2683.422382] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2683.422566] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2683.422760] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2683.422851] nova-compute[62208]: DEBUG nova.virt.hardware [None 
req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2683.423086] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2683.423256] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2683.423563] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2683.423770] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2683.424538] nova-compute[62208]: DEBUG nova.virt.hardware [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2683.425411] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2482850-444e-43fa-9673-0fe5e7b4fade {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.428275] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2683.428275] nova-compute[62208]: warnings.warn( [ 2683.434528] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4062bb1-d54d-44fa-a00f-d1237de684e2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.439555] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2683.439555] nova-compute[62208]: warnings.warn( [ 2683.615320] nova-compute[62208]: DEBUG nova.network.neutron [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Successfully created port: 7bb4d2cc-6c15-46f2-9750-703926d3a104 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2684.252109] nova-compute[62208]: DEBUG nova.compute.manager [req-c1edea1c-03a4-4adb-adec-0c5fca0f5bb3 req-eb13375e-6a0c-41be-9c31-c318e7dbd63a service nova] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Received event network-vif-plugged-7bb4d2cc-6c15-46f2-9750-703926d3a104 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2684.252109] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c1edea1c-03a4-4adb-adec-0c5fca0f5bb3 req-eb13375e-6a0c-41be-9c31-c318e7dbd63a service nova] Acquiring lock "d5289eeb-c269-431e-9a8e-d27487e12b2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2684.252109] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c1edea1c-03a4-4adb-adec-0c5fca0f5bb3 req-eb13375e-6a0c-41be-9c31-c318e7dbd63a service nova] Lock "d5289eeb-c269-431e-9a8e-d27487e12b2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2684.252109] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c1edea1c-03a4-4adb-adec-0c5fca0f5bb3 req-eb13375e-6a0c-41be-9c31-c318e7dbd63a service nova] Lock "d5289eeb-c269-431e-9a8e-d27487e12b2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2684.252109] nova-compute[62208]: DEBUG nova.compute.manager [req-c1edea1c-03a4-4adb-adec-0c5fca0f5bb3 req-eb13375e-6a0c-41be-9c31-c318e7dbd63a service nova] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] No waiting events found dispatching network-vif-plugged-7bb4d2cc-6c15-46f2-9750-703926d3a104 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2684.252109] nova-compute[62208]: WARNING nova.compute.manager [req-c1edea1c-03a4-4adb-adec-0c5fca0f5bb3 req-eb13375e-6a0c-41be-9c31-c318e7dbd63a service nova] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Received unexpected event network-vif-plugged-7bb4d2cc-6c15-46f2-9750-703926d3a104 for instance with vm_state building and task_state spawning. 
[ 2684.324793] nova-compute[62208]: DEBUG nova.network.neutron [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Successfully updated port: 7bb4d2cc-6c15-46f2-9750-703926d3a104 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2684.337166] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "refresh_cache-d5289eeb-c269-431e-9a8e-d27487e12b2a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2684.337328] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "refresh_cache-d5289eeb-c269-431e-9a8e-d27487e12b2a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2684.337451] nova-compute[62208]: DEBUG nova.network.neutron [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2684.381880] nova-compute[62208]: DEBUG nova.network.neutron [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2684.533997] nova-compute[62208]: DEBUG nova.network.neutron [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Updating instance_info_cache with network_info: [{"id": "7bb4d2cc-6c15-46f2-9750-703926d3a104", "address": "fa:16:3e:ca:05:05", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7bb4d2cc-6c", "ovs_interfaceid": "7bb4d2cc-6c15-46f2-9750-703926d3a104", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2684.547893] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "refresh_cache-d5289eeb-c269-431e-9a8e-d27487e12b2a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2684.548225] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Instance network_info: |[{"id": "7bb4d2cc-6c15-46f2-9750-703926d3a104", "address": "fa:16:3e:ca:05:05", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7bb4d2cc-6c", "ovs_interfaceid": "7bb4d2cc-6c15-46f2-9750-703926d3a104", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 2684.548727] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ca:05:05', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'da623279-b6f6-4570-8b15-a332120b8b60', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7bb4d2cc-6c15-46f2-9750-703926d3a104', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2684.556385] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2684.557033] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2684.557274] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b0b912da-00f8-4b0b-b3f2-deeb4f3c5fd9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2684.572060] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2684.572060] nova-compute[62208]: warnings.warn( [ 2684.578090] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2684.578090] nova-compute[62208]: value = "task-38728" [ 2684.578090] nova-compute[62208]: _type = "Task" [ 2684.578090] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2684.581386] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2684.581386] nova-compute[62208]: warnings.warn( [ 2684.587067] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38728, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2685.083194] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2685.083194] nova-compute[62208]: warnings.warn( [ 2685.089631] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38728, 'name': CreateVM_Task, 'duration_secs': 0.327083} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2685.089810] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2685.090379] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2685.090598] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2685.093379] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc14063a-2bb0-4899-a72b-8477c3d82b88 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2685.103534] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2685.103534] nova-compute[62208]: warnings.warn( [ 2685.123636] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2685.123979] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b16f5ac1-c336-4d61-af26-b18d35259f41 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2685.134529] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2685.134529] nova-compute[62208]: warnings.warn( [ 2685.141191] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2685.141191] nova-compute[62208]: value = "task-38729" [ 2685.141191] nova-compute[62208]: _type = "Task" [ 2685.141191] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2685.144634] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2685.144634] nova-compute[62208]: warnings.warn( [ 2685.152403] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38729, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2685.645929] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2685.645929] nova-compute[62208]: warnings.warn( [ 2685.651825] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38729, 'name': ReconfigVM_Task, 'duration_secs': 0.106489} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2685.652116] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2685.652330] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.562s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2685.652648] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2685.652816] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2685.653131] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired external semaphore "[datastore2] 
devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2685.653390] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-623a0cf0-fc9b-47a0-ba48-0edfe5101839 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2685.655120] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2685.655120] nova-compute[62208]: warnings.warn( [ 2685.658163] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2685.658163] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527a15db-a3ca-e1fb-15c4-1dd6aee5bd23" [ 2685.658163] nova-compute[62208]: _type = "Task" [ 2685.658163] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2685.661077] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2685.661077] nova-compute[62208]: warnings.warn( [ 2685.665816] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527a15db-a3ca-e1fb-15c4-1dd6aee5bd23, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2686.162551] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2686.162551] nova-compute[62208]: warnings.warn( [ 2686.168993] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2686.169217] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2686.169427] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2686.276553] nova-compute[62208]: DEBUG nova.compute.manager [req-63375835-5f79-41ff-a8ef-9955a5321c23 req-ed917428-aa3a-4dd5-af8f-808dfb3982b7 service nova] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Received event network-changed-7bb4d2cc-6c15-46f2-9750-703926d3a104 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2686.276947] nova-compute[62208]: DEBUG nova.compute.manager [req-63375835-5f79-41ff-a8ef-9955a5321c23 req-ed917428-aa3a-4dd5-af8f-808dfb3982b7 service nova] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Refreshing instance network info cache due to event network-changed-7bb4d2cc-6c15-46f2-9750-703926d3a104. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2686.277286] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-63375835-5f79-41ff-a8ef-9955a5321c23 req-ed917428-aa3a-4dd5-af8f-808dfb3982b7 service nova] Acquiring lock "refresh_cache-d5289eeb-c269-431e-9a8e-d27487e12b2a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2686.277559] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-63375835-5f79-41ff-a8ef-9955a5321c23 req-ed917428-aa3a-4dd5-af8f-808dfb3982b7 service nova] Acquired lock "refresh_cache-d5289eeb-c269-431e-9a8e-d27487e12b2a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2686.277842] nova-compute[62208]: DEBUG nova.network.neutron [req-63375835-5f79-41ff-a8ef-9955a5321c23 req-ed917428-aa3a-4dd5-af8f-808dfb3982b7 service nova] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Refreshing network info cache for port 7bb4d2cc-6c15-46f2-9750-703926d3a104 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2686.729863] nova-compute[62208]: DEBUG nova.network.neutron [req-63375835-5f79-41ff-a8ef-9955a5321c23 req-ed917428-aa3a-4dd5-af8f-808dfb3982b7 service nova] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Updated VIF entry in instance network info cache for port 7bb4d2cc-6c15-46f2-9750-703926d3a104. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2686.730657] nova-compute[62208]: DEBUG nova.network.neutron [req-63375835-5f79-41ff-a8ef-9955a5321c23 req-ed917428-aa3a-4dd5-af8f-808dfb3982b7 service nova] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Updating instance_info_cache with network_info: [{"id": "7bb4d2cc-6c15-46f2-9750-703926d3a104", "address": "fa:16:3e:ca:05:05", "network": {"id": "38939025-5e89-4f09-8e13-3a02a4138e76", "bridge": "br-int", "label": "tempest-ServersTestJSON-650006473-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "e8eef1e68dea42cf98f03dc8db29498a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da623279-b6f6-4570-8b15-a332120b8b60", "external-id": "nsx-vlan-transportzone-733", "segmentation_id": 733, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7bb4d2cc-6c", "ovs_interfaceid": "7bb4d2cc-6c15-46f2-9750-703926d3a104", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2686.740438] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-63375835-5f79-41ff-a8ef-9955a5321c23 req-ed917428-aa3a-4dd5-af8f-808dfb3982b7 service nova] Releasing lock "refresh_cache-d5289eeb-c269-431e-9a8e-d27487e12b2a" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2696.148311] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2699.144066] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2699.144441] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2699.144441] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2699.159701] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e5ed059f-0390-480e-bafe-17092f272131] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2699.159896] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2699.159990] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c2db95e8-c625-4c06-bded-237af38df144] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2699.160129] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2699.160255] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2699.160386] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2701.141391] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2705.140999] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2706.136661] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2707.140887] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2708.141343] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2708.141714] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2710.141490] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2710.141887] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2710.152750] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2710.152983] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2710.153147] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2710.153306] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2710.154423] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38c24bb0-2757-4e10-a36c-71e5c9eebee0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2710.157237] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2710.157237] nova-compute[62208]: warnings.warn( [ 2710.163642] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-079054dc-274b-4aee-b82b-f7b7ae6e524d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2710.167196] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2710.167196] nova-compute[62208]: warnings.warn( [ 2710.178479] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feddca76-10cd-4682-8c15-b13804a54dd2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2710.180979] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2710.180979] nova-compute[62208]: warnings.warn( [ 2710.185665] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa687ebc-9092-46f4-b08a-a27718dcb854 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2710.188473] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2710.188473] nova-compute[62208]: warnings.warn( [ 2710.214305] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181966MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2710.214439] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2710.214635] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2710.264085] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance e5ed059f-0390-480e-bafe-17092f272131 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2710.264257] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c5f6249c-4435-4aad-99ce-f426c042a24a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2710.264383] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c2db95e8-c625-4c06-bded-237af38df144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2710.264506] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d238f196-f7e6-455b-b514-e8475e204e82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2710.264625] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5289eeb-c269-431e-9a8e-d27487e12b2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2710.264804] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2710.264940] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2710.335354] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c042594-3155-47af-af87-5b4528d867ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2710.337863] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2710.337863] nova-compute[62208]: warnings.warn( [ 2710.343722] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b4b7a02-7168-4213-a769-1791b3f1c88a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2710.346511] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2710.346511] nova-compute[62208]: warnings.warn( [ 2710.373730] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19ba741d-a111-41ed-bd7e-8f18e5453365 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2710.376247] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2710.376247] nova-compute[62208]: warnings.warn( [ 2710.381697] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3754f742-7166-4aaf-b5b3-8813d34cc7ca {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2710.385738] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2710.385738] nova-compute[62208]: warnings.warn( [ 2710.395243] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2710.403960] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2710.420676] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2710.420863] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2720.416581] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2730.215273] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None 
req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2730.215273] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2730.216055] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2730.218230] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2730.218570] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Copying Virtual Disk [datastore2] vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/f800162d-04fc-4c67-8fe8-5dab7d3a8402/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2730.218950] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b9dd92a5-aac7-4b69-b162-dd5661748f8d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2730.223580] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: 
Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.223580] nova-compute[62208]: warnings.warn( [ 2730.229960] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Waiting for the task: (returnval){ [ 2730.229960] nova-compute[62208]: value = "task-38730" [ 2730.229960] nova-compute[62208]: _type = "Task" [ 2730.229960] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2730.233094] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.233094] nova-compute[62208]: warnings.warn( [ 2730.238015] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Task: {'id': task-38730, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2730.734170] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.734170] nova-compute[62208]: warnings.warn( [ 2730.740220] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2730.740506] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2730.741105] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] Traceback (most recent call last): [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] yield resources [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] self.driver.spawn(context, instance, image_meta, [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] self._fetch_image_if_missing(context, vi) [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] image_cache(vi, tmp_image_ds_loc) [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] vm_util.copy_virtual_disk( [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: 
e5ed059f-0390-480e-bafe-17092f272131] session._wait_for_task(vmdk_copy_task) [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] return self.wait_for_task(task_ref) [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] return evt.wait() [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] result = hub.switch() [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] return self.greenlet.switch() [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] self.f(*self.args, **self.kw) [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] raise exceptions.translate_fault(task_info.error) [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] Faults: ['InvalidArgument'] [ 2730.741105] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] [ 2730.742146] nova-compute[62208]: INFO nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Terminating instance [ 2730.742989] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2730.743195] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 
tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2730.743439] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e74bfcc2-4f3a-4983-8a96-6d8012188444 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2730.745747] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2730.745944] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2730.746720] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7be6c93-becb-4b68-a71c-c8cd9bab226a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2730.749313] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.749313] nova-compute[62208]: warnings.warn( [ 2730.749677] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.749677] nova-compute[62208]: warnings.warn( [ 2730.754017] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2730.754243] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e7ba279e-9334-4eb4-a6a6-6ca88b76be2d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2730.756575] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2730.756744] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2730.757310] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.757310] nova-compute[62208]: warnings.warn( [ 2730.757729] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0d8c1f82-38f1-40ae-9761-fd46164161dc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2730.759687] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.759687] nova-compute[62208]: warnings.warn( [ 2730.762460] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2730.762460] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524ce6c4-0eac-df65-8b23-c73fa1f634fb" [ 2730.762460] nova-compute[62208]: _type = "Task" [ 2730.762460] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2730.765363] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.765363] nova-compute[62208]: warnings.warn( [ 2730.770017] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]524ce6c4-0eac-df65-8b23-c73fa1f634fb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2730.824864] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2730.825140] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2730.825285] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Deleting the datastore file [datastore2] e5ed059f-0390-480e-bafe-17092f272131 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2730.825540] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-36209708-4dca-4773-b52f-88561b5b95ad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2730.828093] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.828093] nova-compute[62208]: warnings.warn( [ 2730.833314] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Waiting for the task: (returnval){ [ 2730.833314] nova-compute[62208]: value = "task-38732" [ 2730.833314] nova-compute[62208]: _type = "Task" [ 2730.833314] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2730.836246] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2730.836246] nova-compute[62208]: warnings.warn( [ 2730.841370] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Task: {'id': task-38732, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2731.267012] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.267012] nova-compute[62208]: warnings.warn( [ 2731.273115] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2731.273372] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2731.273610] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-230b1ce2-7707-4e53-b0da-902ae848191a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.275445] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.275445] nova-compute[62208]: warnings.warn( [ 2731.285390] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2731.285576] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Fetch image to [datastore2] vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2731.285744] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2731.286530] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d987b7e-9afa-448b-b18d-a98f3bf63e7b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.288836] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.288836] nova-compute[62208]: warnings.warn( [ 2731.293136] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6072339-db0c-4608-879f-d360d1cc9f8e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.295260] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.295260] nova-compute[62208]: warnings.warn( [ 2731.302386] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f4defb-dbd8-406c-aa18-bec5b67b61b7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.305917] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.305917] nova-compute[62208]: warnings.warn( [ 2731.332245] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79149a39-27c2-40aa-972c-ac355be71b7c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.338339] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.338339] nova-compute[62208]: warnings.warn( [ 2731.338783] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.338783] nova-compute[62208]: warnings.warn( [ 2731.345241] nova-compute[62208]: DEBUG oslo_vmware.api [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Task: {'id': task-38732, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.091244} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2731.345458] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-00c3972f-4b01-4972-96d3-18ac635d9351 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.347188] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2731.347380] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2731.347556] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2731.347808] nova-compute[62208]: INFO nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Took 0.60 seconds to destroy the instance on the hypervisor. 
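The CopyVirtualDisk_Task failure above ("A specified parameter was not correct: fileType") surfaces through oslo_vmware's task-polling loop: wait_for_task drives _poll_task on a looping call, each poll produces the "progress is 0%" entries, and an error state is translated into a VimFaultException. A minimal sketch of that control flow, assuming a hypothetical poll_task_info() callable in place of the real vim bindings:

# Illustrative polling loop behind the "Waiting for the task ... progress is 0%"
# entries above. poll_task_info() is a hypothetical stand-in for the
# PropertyCollector round-trip the real oslo_vmware session performs; only the
# control flow mirrors the log.
import time

class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(poll_task_info, interval=0.5):
    """Poll a VMware task description until it succeeds, or raise on error."""
    while True:
        info = poll_task_info()            # e.g. {'state': 'running', 'progress': 0}
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # corresponds to "raise exceptions.translate_fault(task_info.error)"
            # in the traceback above
            raise VimFaultException(info.get("faults", []), info.get("message", ""))
        time.sleep(interval)               # each iteration yields one "progress is N%" line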
[ 2731.349200] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.349200] nova-compute[62208]: warnings.warn( [ 2731.350036] nova-compute[62208]: DEBUG nova.compute.claims [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936d5aa10> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2731.350219] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2731.350435] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2731.370182] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2731.423621] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2731.480201] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2731.480393] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2731.516765] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6f187fa-d63e-4ecc-a580-845921ed1f9a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.519360] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.519360] nova-compute[62208]: warnings.warn( [ 2731.524264] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dc4bbf7-0d79-4b21-9d76-5e649cb11e7d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.527674] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.527674] nova-compute[62208]: warnings.warn( [ 2731.554625] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-305b8a36-4729-44c9-80d4-6ecf03174603 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.556841] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.556841] nova-compute[62208]: warnings.warn( [ 2731.562153] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-160f3f37-7c43-4425-ab24-88ffc386b609 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2731.565791] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2731.565791] nova-compute[62208]: warnings.warn( [ 2731.576017] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2731.584692] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2731.601649] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.251s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2731.602186] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] Traceback (most recent call last): [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] self.driver.spawn(context, instance, image_meta, [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] self._fetch_image_if_missing(context, vi) [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: 
e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] image_cache(vi, tmp_image_ds_loc) [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] vm_util.copy_virtual_disk( [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] session._wait_for_task(vmdk_copy_task) [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] return self.wait_for_task(task_ref) [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] return evt.wait() [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] result = hub.switch() [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] return self.greenlet.switch() [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] self.f(*self.args, **self.kw) [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] raise exceptions.translate_fault(task_info.error) [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] Faults: ['InvalidArgument'] [ 2731.602186] nova-compute[62208]: ERROR nova.compute.manager [instance: e5ed059f-0390-480e-bafe-17092f272131] [ 2731.603311] 
nova-compute[62208]: DEBUG nova.compute.utils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2731.604349] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Build of instance e5ed059f-0390-480e-bafe-17092f272131 was re-scheduled: A specified parameter was not correct: fileType [ 2731.604349] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2731.604714] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2731.604890] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2731.605059] nova-compute[62208]: DEBUG nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2731.605222] nova-compute[62208]: DEBUG nova.network.neutron [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2731.888266] nova-compute[62208]: DEBUG nova.network.neutron [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2731.902271] nova-compute[62208]: INFO nova.compute.manager [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Took 0.30 seconds to deallocate network for instance. 
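The oslo_vmware.rw_handles entries earlier in this block stream the image directly to the ESX host's /folder endpoint over HTTPS (note the explicit size = 50659328 and the dcPath/dsName query parameters). A rough sketch of that upload pattern using requests; the real FileWriteHandle manages the vCenter session ticket and chunked writes itself, so the cookie handling below is a placeholder assumption:

# Simplified illustration of writing image data to a datastore /folder URL,
# in the spirit of oslo_vmware.rw_handles. Session-ticket auth and chunking
# are glossed over; verify=False mirrors the InsecureRequestWarning lines
# that accompany every request in this log.
import requests

def upload_to_datastore(url, data_iter, size, session_cookie):
    headers = {
        "Content-Type": "application/octet-stream",
        "Content-Length": str(size),       # e.g. 50659328, as reported above
        "Cookie": session_cookie,          # placeholder for the vCenter session ticket
    }
    # requests streams the iterator as the request body
    resp = requests.put(url, data=data_iter, headers=headers, verify=False)
    resp.raise_for_status()
    return resp.status_code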
[ 2732.001824] nova-compute[62208]: INFO nova.scheduler.client.report [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Deleted allocations for instance e5ed059f-0390-480e-bafe-17092f272131 [ 2732.029561] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-81a1e20d-e1d2-435e-acc6-26bd251b334f tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "e5ed059f-0390-480e-bafe-17092f272131" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 341.946s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2732.029927] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "e5ed059f-0390-480e-bafe-17092f272131" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 327.632s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2732.030109] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: e5ed059f-0390-480e-bafe-17092f272131] During sync_power_state the instance has a pending task (spawning). Skip. [ 2732.030302] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "e5ed059f-0390-480e-bafe-17092f272131" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2732.030885] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "e5ed059f-0390-480e-bafe-17092f272131" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 146.098s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2732.031119] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Acquiring lock "e5ed059f-0390-480e-bafe-17092f272131-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2732.031327] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "e5ed059f-0390-480e-bafe-17092f272131-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2732.031492] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 
tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "e5ed059f-0390-480e-bafe-17092f272131-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2732.033515] nova-compute[62208]: INFO nova.compute.manager [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Terminating instance [ 2732.035358] nova-compute[62208]: DEBUG nova.compute.manager [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2732.035488] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2732.036625] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9212b1bd-74f1-4d24-a264-d5afe6cd2a6a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2732.040941] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2732.040941] nova-compute[62208]: warnings.warn( [ 2732.047854] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e08f3661-1104-48ce-a639-b2b3c26c08cd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2732.058562] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2732.058562] nova-compute[62208]: warnings.warn( [ 2732.074035] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e5ed059f-0390-480e-bafe-17092f272131 could not be found. 
[ 2732.074035] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2732.074035] nova-compute[62208]: INFO nova.compute.manager [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] [instance: e5ed059f-0390-480e-bafe-17092f272131] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2732.074245] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2732.074401] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: e5ed059f-0390-480e-bafe-17092f272131] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2732.074498] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: e5ed059f-0390-480e-bafe-17092f272131] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2732.102213] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: e5ed059f-0390-480e-bafe-17092f272131] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2732.109857] nova-compute[62208]: INFO nova.compute.manager [-] [instance: e5ed059f-0390-480e-bafe-17092f272131] Took 0.04 seconds to deallocate network for instance. 
[ 2732.198711] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-e2ecf536-092c-483b-94aa-271a8aeddf87 tempest-ServersNegativeTestMultiTenantJSON-343759983 tempest-ServersNegativeTestMultiTenantJSON-343759983-project-member] Lock "e5ed059f-0390-480e-bafe-17092f272131" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.168s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2758.143842] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2761.141297] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2761.141669] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2761.141669] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2761.155623] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2761.155799] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c2db95e8-c625-4c06-bded-237af38df144] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2761.155899] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2761.156049] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2761.156178] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2761.156674] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2766.151839] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2767.141852] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2769.143604] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2770.141986] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2770.142285] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2771.141137] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2772.140726] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2772.150889] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2772.151179] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2772.151277] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2772.151430] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2772.152612] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660e7c91-dae1-4dc8-9a28-20d0380f9596 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2772.155472] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2772.155472] nova-compute[62208]: warnings.warn( [ 2772.161787] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12ae4df9-fb72-45a0-945d-7630531e257f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2772.165396] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2772.165396] nova-compute[62208]: warnings.warn( [ 2772.176968] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09a7fd03-29fa-4441-b0e7-2b5cb7d0b88c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2772.179298] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2772.179298] nova-compute[62208]: warnings.warn( [ 2772.183464] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-206e1100-c045-4a64-a225-e3fd8a29721c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2772.186125] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2772.186125] nova-compute[62208]: warnings.warn( [ 2772.211537] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181903MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2772.211706] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2772.211871] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2772.258170] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c5f6249c-4435-4aad-99ce-f426c042a24a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2772.258325] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance c2db95e8-c625-4c06-bded-237af38df144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2772.258487] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d238f196-f7e6-455b-b514-e8475e204e82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2772.258624] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5289eeb-c269-431e-9a8e-d27487e12b2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2772.258802] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2772.258937] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2772.317027] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f35bd2b4-6bab-4660-be4c-d0ccf243f7de {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2772.319484] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2772.319484] nova-compute[62208]: warnings.warn( [ 2772.324515] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-653af2da-2918-4135-b06a-325e11304e3c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2772.327559] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2772.327559] nova-compute[62208]: warnings.warn( [ 2772.353146] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98c9fe5e-5dd4-4ac8-81d9-be7f1828da90 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2772.355436] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2772.355436] nova-compute[62208]: warnings.warn( [ 2772.360669] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0c31318-6b16-4850-bfe7-473b1cacfdfa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2772.365617] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2772.365617] nova-compute[62208]: warnings.warn( [ 2772.375214] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2772.383958] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2772.400134] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2772.400332] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2780.230516] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2780.230516] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2780.230516] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2780.230516] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2780.232557] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2780.232830] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Copying Virtual Disk [datastore2] vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/9e04b70b-abf1-43d0-8dd3-67416fa1400c/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2780.233132] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1a19acd3-52de-4d97-bb61-3d7611843838 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2780.235509] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.235509] nova-compute[62208]: warnings.warn( [ 2780.241754] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2780.241754] nova-compute[62208]: value = "task-38733" [ 2780.241754] nova-compute[62208]: _type = "Task" [ 2780.241754] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2780.245021] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.245021] nova-compute[62208]: warnings.warn( [ 2780.250361] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38733, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2780.746001] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.746001] nova-compute[62208]: warnings.warn( [ 2780.752636] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2780.752929] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2780.753492] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Traceback (most recent call last): [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] yield resources [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] self.driver.spawn(context, instance, image_meta, [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2780.753492] 
nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] self._fetch_image_if_missing(context, vi) [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] image_cache(vi, tmp_image_ds_loc) [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] vm_util.copy_virtual_disk( [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] session._wait_for_task(vmdk_copy_task) [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] return self.wait_for_task(task_ref) [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] return evt.wait() [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] result = hub.switch() [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] return self.greenlet.switch() [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] self.f(*self.args, **self.kw) [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] raise exceptions.translate_fault(task_info.error) [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2780.753492] nova-compute[62208]: ERROR 
nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Faults: ['InvalidArgument'] [ 2780.753492] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] [ 2780.754602] nova-compute[62208]: INFO nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Terminating instance [ 2780.755364] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2780.755575] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2780.755820] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dd0d863d-d117-49fa-9598-fb0a85010a2a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2780.758215] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2780.758412] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2780.759396] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0b8cbf5-9993-47d9-ae95-e52948417f64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2780.762737] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.762737] nova-compute[62208]: warnings.warn( [ 2780.763303] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.763303] nova-compute[62208]: warnings.warn( [ 2780.767916] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2780.768188] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-606cc7dc-a5dd-4236-89ec-1cd0906d8718 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2780.770739] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2780.770914] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2780.771531] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.771531] nova-compute[62208]: warnings.warn( [ 2780.771995] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-49bf03fe-b04f-4d79-ab5c-501085779a95 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2780.774147] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.774147] nova-compute[62208]: warnings.warn( [ 2780.777768] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 2780.777768] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a21f97-aa78-e73a-7a1d-a74c0afdd940" [ 2780.777768] nova-compute[62208]: _type = "Task" [ 2780.777768] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2780.781735] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.781735] nova-compute[62208]: warnings.warn( [ 2780.787786] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a21f97-aa78-e73a-7a1d-a74c0afdd940, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2780.844325] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2780.844594] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2780.844741] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleting the datastore file [datastore2] c5f6249c-4435-4aad-99ce-f426c042a24a {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2780.845023] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-810deda1-d76f-49d4-a15e-959c29cea4b7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2780.847082] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.847082] nova-compute[62208]: warnings.warn( [ 2780.852453] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2780.852453] nova-compute[62208]: value = "task-38735" [ 2780.852453] nova-compute[62208]: _type = "Task" [ 2780.852453] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2780.855745] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2780.855745] nova-compute[62208]: warnings.warn( [ 2780.861474] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38735, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2781.282414] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.282414] nova-compute[62208]: warnings.warn( [ 2781.288750] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2781.289005] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Creating directory with path [datastore2] vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2781.289339] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-66230b10-5cf3-48cd-965f-5bfc8595dc87 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.291317] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.291317] nova-compute[62208]: warnings.warn( [ 2781.302221] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Created directory with path [datastore2] vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2781.302443] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Fetch image to [datastore2] vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2781.302583] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2781.303354] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-988cf0b1-da0b-4ac9-a85a-a237d46e5d09 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.305797] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.305797] nova-compute[62208]: warnings.warn( [ 2781.310614] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eac0350-516c-44d6-98e7-dce901a7d67f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.312918] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.312918] nova-compute[62208]: warnings.warn( [ 2781.320624] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74d4ba1-caed-4ff8-9ce4-71a209073eb0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.324299] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.324299] nova-compute[62208]: warnings.warn( [ 2781.353612] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-728654e4-d8a8-4536-b2c7-f369a3f74960 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.358753] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.358753] nova-compute[62208]: warnings.warn( [ 2781.359317] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.359317] nova-compute[62208]: warnings.warn( [ 2781.365602] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-651984a1-d7a1-4c25-b7d7-3f6f38393ef5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.367494] nova-compute[62208]: DEBUG oslo_vmware.api [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38735, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076959} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2781.367773] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2781.367954] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2781.368156] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2781.368353] nova-compute[62208]: INFO nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Took 0.61 seconds to destroy the instance on the hypervisor. [ 2781.369838] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.369838] nova-compute[62208]: warnings.warn( [ 2781.370668] nova-compute[62208]: DEBUG nova.compute.claims [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936ce0130> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2781.370847] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2781.371100] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2781.391595] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2781.449839] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2781.508276] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2781.508495] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2781.541561] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90c22cdf-a151-471c-be52-7e451b5fba78 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.544177] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.544177] nova-compute[62208]: warnings.warn( [ 2781.549304] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12a26b41-9a01-4cdb-b76c-49450252f918 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.552621] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.552621] nova-compute[62208]: warnings.warn( [ 2781.580123] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f8762dd-0752-49e7-aa6f-ff5a0a63d40c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.582635] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.582635] nova-compute[62208]: warnings.warn( [ 2781.588330] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fe8266b-39da-4107-a416-e05bd47db848 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2781.592590] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2781.592590] nova-compute[62208]: warnings.warn( [ 2781.603245] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2781.611978] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2781.629296] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.258s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2781.629975] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Traceback (most recent call last): [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] self.driver.spawn(context, instance, image_meta, [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] self._fetch_image_if_missing(context, vi) [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] image_cache(vi, tmp_image_ds_loc) [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] vm_util.copy_virtual_disk( [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] session._wait_for_task(vmdk_copy_task) [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] return self.wait_for_task(task_ref) [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] return evt.wait() [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] result = hub.switch() [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] return self.greenlet.switch() [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] self.f(*self.args, **self.kw) [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] raise exceptions.translate_fault(task_info.error) [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Faults: ['InvalidArgument'] [ 2781.629975] nova-compute[62208]: ERROR nova.compute.manager [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] [ 2781.631111] nova-compute[62208]: DEBUG nova.compute.utils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 
tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2781.632600] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Build of instance c5f6249c-4435-4aad-99ce-f426c042a24a was re-scheduled: A specified parameter was not correct: fileType [ 2781.632600] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2781.633126] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2781.633358] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2781.633604] nova-compute[62208]: DEBUG nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2781.633853] nova-compute[62208]: DEBUG nova.network.neutron [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2781.952471] nova-compute[62208]: DEBUG nova.network.neutron [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2781.970070] nova-compute[62208]: INFO nova.compute.manager [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Took 0.34 seconds to deallocate network for instance. 
[ 2782.072636] nova-compute[62208]: INFO nova.scheduler.client.report [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted allocations for instance c5f6249c-4435-4aad-99ce-f426c042a24a [ 2782.093747] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-14b03297-e0a3-4eb9-a54e-dfade91c4393 tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 386.943s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2782.094018] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 377.696s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2782.094214] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] During sync_power_state the instance has a pending task (spawning). Skip. [ 2782.094398] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2782.095234] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 191.119s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2782.095234] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "c5f6249c-4435-4aad-99ce-f426c042a24a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2782.095577] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2782.095577] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock 
"c5f6249c-4435-4aad-99ce-f426c042a24a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2782.098153] nova-compute[62208]: INFO nova.compute.manager [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Terminating instance [ 2782.101526] nova-compute[62208]: DEBUG nova.compute.manager [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2782.101526] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2782.101526] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-630faba1-9756-4204-99ae-39f24df7086a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2782.103426] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2782.103426] nova-compute[62208]: warnings.warn( [ 2782.110949] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ecb8a20-0f27-4b06-863f-9f3593fad675 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2782.122166] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2782.122166] nova-compute[62208]: warnings.warn( [ 2782.137729] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c5f6249c-4435-4aad-99ce-f426c042a24a could not be found. 
[ 2782.138235] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2782.138504] nova-compute[62208]: INFO nova.compute.manager [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2782.138761] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2782.139581] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2782.139928] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2782.169544] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2782.178542] nova-compute[62208]: INFO nova.compute.manager [-] [instance: c5f6249c-4435-4aad-99ce-f426c042a24a] Took 0.04 seconds to deallocate network for instance. 
[ 2782.283389] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-04e13554-074f-4be1-a8aa-34fd7e3755ae tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "c5f6249c-4435-4aad-99ce-f426c042a24a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.188s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2783.494732] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "4d141090-57cf-442a-a03e-6151d29f2266" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2783.495094] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "4d141090-57cf-442a-a03e-6151d29f2266" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2783.506496] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2783.556525] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2783.556782] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2783.558210] nova-compute[62208]: INFO nova.compute.claims [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2783.660502] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6cea05a-a9e2-4e41-9e7a-cea228a4ea81 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2783.662892] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2783.662892] nova-compute[62208]: warnings.warn( [ 2784.472216] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f484ab67-3588-469f-9b15-fbf33519f0b4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2784.475240] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2784.475240] nova-compute[62208]: warnings.warn( [ 2784.503115] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51d4798b-c8ad-4985-82f2-1801c9388289 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2784.505537] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2784.505537] nova-compute[62208]: warnings.warn( [ 2784.511531] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd6b0a6b-5d52-492e-ad45-0556c211562b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2784.515356] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2784.515356] nova-compute[62208]: warnings.warn( [ 2784.525023] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2784.536022] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2784.550126] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.993s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2784.550638] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2784.586317] nova-compute[62208]: DEBUG nova.compute.utils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2784.587548] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2784.587725] nova-compute[62208]: DEBUG nova.network.neutron [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2784.603069] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2784.632816] nova-compute[62208]: DEBUG nova.policy [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8cb00a6413b46fcb17cbe532a0bffc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '53b578fa6aa34a2d80eb9938d58ffe12', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2784.672392] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2784.694146] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2784.694374] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2784.694536] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2784.694722] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2784.694870] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 2784.695103] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2784.695223] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2784.695382] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2784.695551] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2784.695738] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2784.695954] nova-compute[62208]: DEBUG nova.virt.hardware [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2784.696841] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2ddae9f-93e4-481c-a907-88e2d253e49f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2784.700027] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2784.700027] nova-compute[62208]: warnings.warn( [ 2784.705958] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a87f955-183a-4684-9593-532c8eff82ed {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2784.710339] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2784.710339] nova-compute[62208]: warnings.warn( [ 2784.879252] nova-compute[62208]: DEBUG nova.network.neutron [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Successfully created port: 6b93cfb0-3c63-46f8-95a9-5d7a45a02d13 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2785.381676] nova-compute[62208]: DEBUG nova.compute.manager [req-c4c57314-6f40-4606-9215-bf30651eaac8 req-579b279b-3e65-4f0c-b65c-15f84b2f7949 service nova] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Received event network-vif-plugged-6b93cfb0-3c63-46f8-95a9-5d7a45a02d13 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2785.381905] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c4c57314-6f40-4606-9215-bf30651eaac8 req-579b279b-3e65-4f0c-b65c-15f84b2f7949 service nova] Acquiring lock "4d141090-57cf-442a-a03e-6151d29f2266-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2785.382135] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c4c57314-6f40-4606-9215-bf30651eaac8 req-579b279b-3e65-4f0c-b65c-15f84b2f7949 service nova] Lock "4d141090-57cf-442a-a03e-6151d29f2266-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2785.382273] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-c4c57314-6f40-4606-9215-bf30651eaac8 req-579b279b-3e65-4f0c-b65c-15f84b2f7949 service nova] Lock "4d141090-57cf-442a-a03e-6151d29f2266-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2785.382434] nova-compute[62208]: DEBUG nova.compute.manager [req-c4c57314-6f40-4606-9215-bf30651eaac8 req-579b279b-3e65-4f0c-b65c-15f84b2f7949 service nova] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] No waiting events found dispatching network-vif-plugged-6b93cfb0-3c63-46f8-95a9-5d7a45a02d13 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2785.382646] nova-compute[62208]: WARNING nova.compute.manager [req-c4c57314-6f40-4606-9215-bf30651eaac8 req-579b279b-3e65-4f0c-b65c-15f84b2f7949 service nova] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Received unexpected event network-vif-plugged-6b93cfb0-3c63-46f8-95a9-5d7a45a02d13 for instance with vm_state building and task_state spawning. 
[ 2785.490276] nova-compute[62208]: DEBUG nova.network.neutron [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Successfully updated port: 6b93cfb0-3c63-46f8-95a9-5d7a45a02d13 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2785.501503] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "refresh_cache-4d141090-57cf-442a-a03e-6151d29f2266" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2785.501503] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "refresh_cache-4d141090-57cf-442a-a03e-6151d29f2266" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2785.501503] nova-compute[62208]: DEBUG nova.network.neutron [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2785.507486] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "c2db95e8-c625-4c06-bded-237af38df144" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2785.545832] nova-compute[62208]: DEBUG nova.network.neutron [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2785.693621] nova-compute[62208]: DEBUG nova.network.neutron [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Updating instance_info_cache with network_info: [{"id": "6b93cfb0-3c63-46f8-95a9-5d7a45a02d13", "address": "fa:16:3e:53:e9:b6", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b93cfb0-3c", "ovs_interfaceid": "6b93cfb0-3c63-46f8-95a9-5d7a45a02d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2785.708010] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "refresh_cache-4d141090-57cf-442a-a03e-6151d29f2266" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2785.708359] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Instance network_info: |[{"id": "6b93cfb0-3c63-46f8-95a9-5d7a45a02d13", "address": "fa:16:3e:53:e9:b6", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b93cfb0-3c", "ovs_interfaceid": "6b93cfb0-3c63-46f8-95a9-5d7a45a02d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 2785.709118] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:53:e9:b6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b5f9472-1844-4c99-8804-8f193cfff562', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6b93cfb0-3c63-46f8-95a9-5d7a45a02d13', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2785.716917] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2785.717578] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2785.717874] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-857221e0-5ca9-4cd5-b432-c81e52ace979 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2785.733394] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2785.733394] nova-compute[62208]: warnings.warn( [ 2785.739736] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2785.739736] nova-compute[62208]: value = "task-38736" [ 2785.739736] nova-compute[62208]: _type = "Task" [ 2785.739736] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2785.743272] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2785.743272] nova-compute[62208]: warnings.warn( [ 2785.748317] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38736, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2786.243973] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2786.243973] nova-compute[62208]: warnings.warn( [ 2786.249885] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38736, 'name': CreateVM_Task, 'duration_secs': 0.313821} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2786.250059] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2786.250636] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2786.250859] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2786.253618] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b68cf9f7-6463-4ec4-a0dd-33f758bf7a5a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2786.263668] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2786.263668] nova-compute[62208]: warnings.warn( [ 2786.282980] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Reconfiguring VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2786.283432] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-c4efa7eb-4c6c-4053-9dad-d6e89b020d8c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2786.294176] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2786.294176] nova-compute[62208]: warnings.warn( [ 2786.300027] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2786.300027] nova-compute[62208]: value = "task-38737" [ 2786.300027] nova-compute[62208]: _type = "Task" [ 2786.300027] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2786.303055] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2786.303055] nova-compute[62208]: warnings.warn( [ 2786.308010] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38737, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2786.803975] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2786.803975] nova-compute[62208]: warnings.warn( [ 2786.810128] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38737, 'name': ReconfigVM_Task, 'duration_secs': 0.118028} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2786.810395] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Reconfigured VM instance to enable vnc on port - 5903 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2786.810605] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.560s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2786.810855] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2786.810995] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2786.811319] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 
tempest-DeleteServersTestJSON-2066579215-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2786.811573] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb249407-6694-481e-8f07-e22f76d0d38d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2786.813081] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2786.813081] nova-compute[62208]: warnings.warn( [ 2786.816117] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2786.816117] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52044436-785c-cb79-4e8a-2cf46f54b84c" [ 2786.816117] nova-compute[62208]: _type = "Task" [ 2786.816117] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2786.819095] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2786.819095] nova-compute[62208]: warnings.warn( [ 2786.828828] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52044436-785c-cb79-4e8a-2cf46f54b84c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2787.320727] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2787.320727] nova-compute[62208]: warnings.warn( [ 2787.327206] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2787.327497] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2787.327710] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2787.410921] nova-compute[62208]: DEBUG nova.compute.manager [req-f2931e62-520e-46b5-a58e-861c014429ea req-a859e47c-58f4-4504-9ac6-8281ffd13a3f service nova] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Received event network-changed-6b93cfb0-3c63-46f8-95a9-5d7a45a02d13 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2787.411077] nova-compute[62208]: DEBUG nova.compute.manager [req-f2931e62-520e-46b5-a58e-861c014429ea req-a859e47c-58f4-4504-9ac6-8281ffd13a3f service nova] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Refreshing instance network info cache due to event network-changed-6b93cfb0-3c63-46f8-95a9-5d7a45a02d13. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2787.411290] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f2931e62-520e-46b5-a58e-861c014429ea req-a859e47c-58f4-4504-9ac6-8281ffd13a3f service nova] Acquiring lock "refresh_cache-4d141090-57cf-442a-a03e-6151d29f2266" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2787.411429] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f2931e62-520e-46b5-a58e-861c014429ea req-a859e47c-58f4-4504-9ac6-8281ffd13a3f service nova] Acquired lock "refresh_cache-4d141090-57cf-442a-a03e-6151d29f2266" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2787.411628] nova-compute[62208]: DEBUG nova.network.neutron [req-f2931e62-520e-46b5-a58e-861c014429ea req-a859e47c-58f4-4504-9ac6-8281ffd13a3f service nova] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Refreshing network info cache for port 6b93cfb0-3c63-46f8-95a9-5d7a45a02d13 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2787.626004] nova-compute[62208]: DEBUG nova.network.neutron [req-f2931e62-520e-46b5-a58e-861c014429ea req-a859e47c-58f4-4504-9ac6-8281ffd13a3f service nova] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Updated VIF entry in instance network info cache for port 6b93cfb0-3c63-46f8-95a9-5d7a45a02d13. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2787.626377] nova-compute[62208]: DEBUG nova.network.neutron [req-f2931e62-520e-46b5-a58e-861c014429ea req-a859e47c-58f4-4504-9ac6-8281ffd13a3f service nova] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Updating instance_info_cache with network_info: [{"id": "6b93cfb0-3c63-46f8-95a9-5d7a45a02d13", "address": "fa:16:3e:53:e9:b6", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b93cfb0-3c", "ovs_interfaceid": "6b93cfb0-3c63-46f8-95a9-5d7a45a02d13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2787.635539] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f2931e62-520e-46b5-a58e-861c014429ea req-a859e47c-58f4-4504-9ac6-8281ffd13a3f service nova] Releasing lock "refresh_cache-4d141090-57cf-442a-a03e-6151d29f2266" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2819.401978] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2821.141068] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2821.141514] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2821.141514] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2821.155325] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: c2db95e8-c625-4c06-bded-237af38df144] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2821.155569] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2821.155647] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2821.155716] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2821.155836] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2822.140492] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2826.136637] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2826.584975] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2826.584975] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2826.584975] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles [ 2826.585561] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2826.587793] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2826.588042] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Copying Virtual Disk [datastore2] vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/2f1f84f0-9608-45a7-8f92-1089b1415f77/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2826.588375] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-06a4fd14-f93b-4807-b9db-00d3c02d4265 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2826.591207] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2826.591207] nova-compute[62208]: warnings.warn( [ 2826.597668] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 2826.597668] nova-compute[62208]: value = "task-38738" [ 2826.597668] nova-compute[62208]: _type = "Task" [ 2826.597668] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2826.600720] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2826.600720] nova-compute[62208]: warnings.warn( [ 2826.605387] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38738, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2827.102512] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.102512] nova-compute[62208]: warnings.warn( [ 2827.108552] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2827.108836] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2827.109384] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] Traceback (most recent call last): [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] yield resources [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] self.driver.spawn(context, instance, image_meta, [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] self._fetch_image_if_missing(context, vi) [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2827.109384] nova-compute[62208]: ERROR 
nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] image_cache(vi, tmp_image_ds_loc) [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] vm_util.copy_virtual_disk( [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] session._wait_for_task(vmdk_copy_task) [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] return self.wait_for_task(task_ref) [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] return evt.wait() [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] result = hub.switch() [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] return self.greenlet.switch() [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] self.f(*self.args, **self.kw) [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] raise exceptions.translate_fault(task_info.error) [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] Faults: ['InvalidArgument'] [ 2827.109384] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] [ 2827.110650] nova-compute[62208]: INFO nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 
tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Terminating instance [ 2827.111536] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2827.111536] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2827.111723] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f44fc0f6-7494-4acb-b252-90b0021544df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.113831] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2827.114020] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2827.114878] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0898f498-22cf-46d8-89fd-a6938875c32d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.117385] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.117385] nova-compute[62208]: warnings.warn( [ 2827.117727] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.117727] nova-compute[62208]: warnings.warn( [ 2827.122155] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2827.122381] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4d9edc36-e10d-4d69-8cb2-a452ac8a52ba {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.124486] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2827.124660] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2827.125207] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.125207] nova-compute[62208]: warnings.warn( [ 2827.125676] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ed2a6668-747a-49dd-ac67-09e1a7bebf76 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.127582] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.127582] nova-compute[62208]: warnings.warn( [ 2827.130580] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Waiting for the task: (returnval){ [ 2827.130580] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527c09f7-faaf-b15c-249c-4455fa0ac1bb" [ 2827.130580] nova-compute[62208]: _type = "Task" [ 2827.130580] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2827.133497] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.133497] nova-compute[62208]: warnings.warn( [ 2827.138112] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527c09f7-faaf-b15c-249c-4455fa0ac1bb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2827.191194] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2827.191406] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2827.191590] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Deleting the datastore file [datastore2] c2db95e8-c625-4c06-bded-237af38df144 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2827.191873] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-afaf91f2-1a60-4482-acf8-dd72b7b7c330 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.193702] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.193702] nova-compute[62208]: warnings.warn( [ 2827.199038] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 2827.199038] nova-compute[62208]: value = "task-38740" [ 2827.199038] nova-compute[62208]: _type = "Task" [ 2827.199038] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2827.202199] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.202199] nova-compute[62208]: warnings.warn( [ 2827.208526] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38740, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2827.635516] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.635516] nova-compute[62208]: warnings.warn( [ 2827.642077] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2827.642566] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Creating directory with path [datastore2] vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2827.642937] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6f7c255e-b3ee-4890-9735-965a5e5acc4f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.644966] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.644966] nova-compute[62208]: warnings.warn( [ 2827.655038] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Created directory with path [datastore2] vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2827.655417] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Fetch image to [datastore2] vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2827.655763] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2827.656716] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db1edc63-2c9d-4abd-aafe-94d0039a42f3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.659226] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.659226] nova-compute[62208]: warnings.warn( [ 2827.664222] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0edd6a24-668f-4508-a656-658a349c3a9b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.666721] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.666721] nova-compute[62208]: warnings.warn( [ 2827.674113] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1659b4d-ada4-4571-856b-4f9286e019e8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.677963] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.677963] nova-compute[62208]: warnings.warn( [ 2827.708658] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0fd79e0-b234-4c33-81d4-603ba2b95f6f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.711104] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.711104] nova-compute[62208]: warnings.warn( [ 2827.711653] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.711653] nova-compute[62208]: warnings.warn( [ 2827.716611] nova-compute[62208]: DEBUG oslo_vmware.api [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38740, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074401} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2827.718220] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2827.718594] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2827.718910] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2827.719213] nova-compute[62208]: INFO nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2827.721426] nova-compute[62208]: DEBUG nova.compute.claims [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936535870> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2827.721736] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2827.722067] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2827.724865] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-29e2a1ca-c7dc-47c8-8898-9c081fb9b6cd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.726723] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.726723] nova-compute[62208]: warnings.warn( [ 2827.745548] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2827.802380] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2827.861208] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Completed reading data from the image iterator. 
{{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2827.861430] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2827.870812] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbdad39c-8e51-477d-9692-a94aec40ab90 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.873701] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.873701] nova-compute[62208]: warnings.warn( [ 2827.878810] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3bcd579-cd52-4833-b057-04bd7f610df5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.881973] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.881973] nova-compute[62208]: warnings.warn( [ 2827.909700] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be812a39-48b5-47f2-9b89-8e957d1c0594 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.912122] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.912122] nova-compute[62208]: warnings.warn( [ 2827.917293] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baee3f97-bb11-42c8-9d04-4ed66c42c19b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2827.921096] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2827.921096] nova-compute[62208]: warnings.warn( [ 2827.930348] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2827.938606] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2827.953833] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.232s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2827.954355] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] Traceback (most recent call last): [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] self.driver.spawn(context, instance, image_meta, [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] self._fetch_image_if_missing(context, vi) [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] image_cache(vi, tmp_image_ds_loc) [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] vm_util.copy_virtual_disk( [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] session._wait_for_task(vmdk_copy_task) [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] return self.wait_for_task(task_ref) [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] return evt.wait() [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] result = hub.switch() [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] return self.greenlet.switch() [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] self.f(*self.args, **self.kw) [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] raise exceptions.translate_fault(task_info.error) [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] Faults: ['InvalidArgument'] [ 2827.954355] nova-compute[62208]: ERROR nova.compute.manager [instance: c2db95e8-c625-4c06-bded-237af38df144] [ 2827.955434] nova-compute[62208]: DEBUG nova.compute.utils 
[None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2827.956445] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Build of instance c2db95e8-c625-4c06-bded-237af38df144 was re-scheduled: A specified parameter was not correct: fileType [ 2827.956445] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2827.956819] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2827.956991] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2827.957161] nova-compute[62208]: DEBUG nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2827.957326] nova-compute[62208]: DEBUG nova.network.neutron [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2828.141178] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2828.221490] nova-compute[62208]: DEBUG nova.network.neutron [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2828.233778] nova-compute[62208]: INFO nova.compute.manager [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Took 0.28 seconds to deallocate network for instance. 
[ 2828.335987] nova-compute[62208]: INFO nova.scheduler.client.report [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Deleted allocations for instance c2db95e8-c625-4c06-bded-237af38df144 [ 2828.356115] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-22021719-a082-4ef1-9214-224cacb5203a tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "c2db95e8-c625-4c06-bded-237af38df144" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.398s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2828.356403] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "c2db95e8-c625-4c06-bded-237af38df144" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 42.849s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2828.356665] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "c2db95e8-c625-4c06-bded-237af38df144-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2828.356874] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "c2db95e8-c625-4c06-bded-237af38df144-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2828.357039] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "c2db95e8-c625-4c06-bded-237af38df144-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2828.359025] nova-compute[62208]: INFO nova.compute.manager [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Terminating instance [ 2828.360680] nova-compute[62208]: DEBUG nova.compute.manager [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Start destroying the instance on the hypervisor. 
{{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2828.360866] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2828.361602] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f9bf104f-5f9a-447d-bcb6-14cf7be94f55 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2828.364099] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2828.364099] nova-compute[62208]: warnings.warn( [ 2828.371515] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c0aa286-0901-401c-a702-831179baee9a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2828.383125] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2828.383125] nova-compute[62208]: warnings.warn( [ 2828.398409] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c2db95e8-c625-4c06-bded-237af38df144 could not be found. [ 2828.398698] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2828.398898] nova-compute[62208]: INFO nova.compute.manager [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: c2db95e8-c625-4c06-bded-237af38df144] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2828.399154] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2828.399373] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: c2db95e8-c625-4c06-bded-237af38df144] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2828.399475] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c2db95e8-c625-4c06-bded-237af38df144] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2828.430447] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: c2db95e8-c625-4c06-bded-237af38df144] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2828.438781] nova-compute[62208]: INFO nova.compute.manager [-] [instance: c2db95e8-c625-4c06-bded-237af38df144] Took 0.04 seconds to deallocate network for instance. [ 2828.525829] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8dffaec2-0620-4da5-a065-7d27eea93f54 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "c2db95e8-c625-4c06-bded-237af38df144" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.169s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2829.141089] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2830.141688] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2830.142132] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2830.264334] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "20e9ed05-3592-4e84-8806-0e30c7563b85" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2830.264632] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "20e9ed05-3592-4e84-8806-0e30c7563b85" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2830.276347] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Starting instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2830.329331] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2830.329676] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2830.331231] nova-compute[62208]: INFO nova.compute.claims [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2830.440446] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d118d48f-d2f6-42ba-9d4b-bb3f3740b78d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.442961] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2830.442961] nova-compute[62208]: warnings.warn( [ 2830.448184] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71945224-889e-4092-b0e4-84f2850ca6ac {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.451278] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2830.451278] nova-compute[62208]: warnings.warn( [ 2830.477006] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a035ad4-a8bd-400f-bd5b-aa6f6c3552df {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.479912] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2830.479912] nova-compute[62208]: warnings.warn( [ 2830.484947] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f631664-4b08-46b2-90b2-1d4a1a63a49f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.489835] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2830.489835] nova-compute[62208]: warnings.warn( [ 2830.499382] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2830.508124] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2830.525892] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2830.526477] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2830.565179] nova-compute[62208]: DEBUG nova.compute.utils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2830.566488] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2830.566666] nova-compute[62208]: DEBUG nova.network.neutron [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2830.578370] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2830.611524] nova-compute[62208]: DEBUG nova.policy [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd51a1b598f0e44c28e2b6cdcbe7ac23e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a811291fa75242d5b998655672131068', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2830.649106] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2830.669936] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2830.670181] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2830.670338] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2830.670517] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2830.670666] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Image pref 0:0:0 {{(pid=62208) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2830.670835] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2830.671064] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2830.671223] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2830.671392] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2830.671555] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2830.671728] nova-compute[62208]: DEBUG nova.virt.hardware [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2830.672588] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a625bb7-a659-4b9f-8e44-cdae19055c20 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.674938] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2830.674938] nova-compute[62208]: warnings.warn( [ 2830.680846] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-662a901d-a605-4f6b-b31b-1367af9a9ce3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.686012] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2830.686012] nova-compute[62208]: warnings.warn( [ 2830.980545] nova-compute[62208]: DEBUG nova.network.neutron [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Successfully created port: 66b22077-7e60-4621-875e-429bd3311b9e {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2831.141414] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2831.608937] nova-compute[62208]: DEBUG nova.compute.manager [req-b6626f57-3f97-4888-9585-2fc319b2d0c1 req-53fa7355-b6ed-4842-8363-b1db51b764a1 service nova] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Received event network-vif-plugged-66b22077-7e60-4621-875e-429bd3311b9e {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2831.609283] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b6626f57-3f97-4888-9585-2fc319b2d0c1 req-53fa7355-b6ed-4842-8363-b1db51b764a1 service nova] Acquiring lock "20e9ed05-3592-4e84-8806-0e30c7563b85-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2831.609366] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b6626f57-3f97-4888-9585-2fc319b2d0c1 req-53fa7355-b6ed-4842-8363-b1db51b764a1 service nova] Lock "20e9ed05-3592-4e84-8806-0e30c7563b85-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2831.609688] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-b6626f57-3f97-4888-9585-2fc319b2d0c1 req-53fa7355-b6ed-4842-8363-b1db51b764a1 service nova] Lock "20e9ed05-3592-4e84-8806-0e30c7563b85-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2831.609867] nova-compute[62208]: DEBUG nova.compute.manager [req-b6626f57-3f97-4888-9585-2fc319b2d0c1 req-53fa7355-b6ed-4842-8363-b1db51b764a1 service nova] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] No waiting events found dispatching network-vif-plugged-66b22077-7e60-4621-875e-429bd3311b9e {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2831.610139] nova-compute[62208]: WARNING nova.compute.manager [req-b6626f57-3f97-4888-9585-2fc319b2d0c1 req-53fa7355-b6ed-4842-8363-b1db51b764a1 service nova] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Received unexpected event network-vif-plugged-66b22077-7e60-4621-875e-429bd3311b9e for instance with vm_state building and task_state spawning. 
[ 2831.703653] nova-compute[62208]: DEBUG nova.network.neutron [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Successfully updated port: 66b22077-7e60-4621-875e-429bd3311b9e {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2831.723768] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "refresh_cache-20e9ed05-3592-4e84-8806-0e30c7563b85" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2831.723976] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquired lock "refresh_cache-20e9ed05-3592-4e84-8806-0e30c7563b85" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2831.724144] nova-compute[62208]: DEBUG nova.network.neutron [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2831.752661] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "d238f196-f7e6-455b-b514-e8475e204e82" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2831.773323] nova-compute[62208]: DEBUG nova.network.neutron [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2831.959161] nova-compute[62208]: DEBUG nova.network.neutron [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Updating instance_info_cache with network_info: [{"id": "66b22077-7e60-4621-875e-429bd3311b9e", "address": "fa:16:3e:76:a7:07", "network": {"id": "3920080a-4a37-46bb-98d8-615215934385", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-635270513-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a811291fa75242d5b998655672131068", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e885ebd4-93ca-4e9e-8889-0f16bd91e61e", "external-id": "nsx-vlan-transportzone-580", "segmentation_id": 580, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap66b22077-7e", "ovs_interfaceid": "66b22077-7e60-4621-875e-429bd3311b9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2831.973600] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Releasing lock "refresh_cache-20e9ed05-3592-4e84-8806-0e30c7563b85" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2831.973961] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Instance network_info: |[{"id": "66b22077-7e60-4621-875e-429bd3311b9e", "address": "fa:16:3e:76:a7:07", "network": {"id": "3920080a-4a37-46bb-98d8-615215934385", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-635270513-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a811291fa75242d5b998655672131068", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e885ebd4-93ca-4e9e-8889-0f16bd91e61e", "external-id": "nsx-vlan-transportzone-580", "segmentation_id": 580, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap66b22077-7e", "ovs_interfaceid": "66b22077-7e60-4621-875e-429bd3311b9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} 
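The network_info blob recorded above is a list of VIF dictionaries, and the fields usually needed when reading such a capture (port ID, MAC address, fixed IP) sit at fixed paths inside each entry. A small, hypothetical extraction helper, assuming an entry shaped exactly like the one logged above:

    def summarize_vif(vif: dict) -> str:
        """Return 'port / mac / first fixed IP' for one network_info entry."""
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"]]
        return f'{vif["id"]} / {vif["address"]} / {ips[0] if ips else "no fixed IP"}'

    # With the values captured in this log, the result would be:
    # "66b22077-7e60-4621-875e-429bd3311b9e / fa:16:3e:76:a7:07 / 10.0.0.14"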
[ 2831.974383] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:76:a7:07', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e885ebd4-93ca-4e9e-8889-0f16bd91e61e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '66b22077-7e60-4621-875e-429bd3311b9e', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2831.982128] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2831.982723] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2831.982997] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1919d534-c761-4827-8130-a68174ea3a08 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2831.999448] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2831.999448] nova-compute[62208]: warnings.warn( [ 2832.006541] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2832.006541] nova-compute[62208]: value = "task-38741" [ 2832.006541] nova-compute[62208]: _type = "Task" [ 2832.006541] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2832.009984] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2832.009984] nova-compute[62208]: warnings.warn( [ 2832.016536] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38741, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2832.510530] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2832.510530] nova-compute[62208]: warnings.warn( [ 2832.516572] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38741, 'name': CreateVM_Task, 'duration_secs': 0.293923} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2832.516739] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2832.517337] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2832.517560] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2832.520370] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6af07a9f-de65-4de7-a649-2ef1df728e64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2832.530285] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2832.530285] nova-compute[62208]: warnings.warn( [ 2832.548618] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2832.548956] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-5880fbee-36d3-44e0-8836-63daa6f780da {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2832.558843] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2832.558843] nova-compute[62208]: warnings.warn( [ 2832.564062] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 2832.564062] nova-compute[62208]: value = "task-38742" [ 2832.564062] nova-compute[62208]: _type = "Task" [ 2832.564062] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2832.567036] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2832.567036] nova-compute[62208]: warnings.warn( [ 2832.572098] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38742, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2833.068355] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2833.068355] nova-compute[62208]: warnings.warn( [ 2833.074943] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38742, 'name': ReconfigVM_Task, 'duration_secs': 0.104177} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2833.075292] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2833.075517] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.558s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2833.075814] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2833.075964] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2833.076343] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 
tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2833.076743] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2169e314-587f-4157-9a8f-50fe8700fd7a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2833.078515] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2833.078515] nova-compute[62208]: warnings.warn( [ 2833.082641] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 2833.082641] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5250b20f-4bba-af79-4c17-b1b2a08f2d54" [ 2833.082641] nova-compute[62208]: _type = "Task" [ 2833.082641] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2833.086040] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2833.086040] nova-compute[62208]: warnings.warn( [ 2833.091141] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5250b20f-4bba-af79-4c17-b1b2a08f2d54, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2833.587227] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2833.587227] nova-compute[62208]: warnings.warn( [ 2833.593722] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2833.593925] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2833.594145] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2833.635806] nova-compute[62208]: DEBUG nova.compute.manager [req-8ab33a6a-7ecd-4688-96a0-66fb4d86cd07 req-fdf408fb-e2ed-4124-9a15-f00fd87be5d9 service nova] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Received event network-changed-66b22077-7e60-4621-875e-429bd3311b9e {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2833.635990] nova-compute[62208]: DEBUG nova.compute.manager [req-8ab33a6a-7ecd-4688-96a0-66fb4d86cd07 req-fdf408fb-e2ed-4124-9a15-f00fd87be5d9 service nova] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Refreshing instance network info cache due to event network-changed-66b22077-7e60-4621-875e-429bd3311b9e. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2833.636189] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-8ab33a6a-7ecd-4688-96a0-66fb4d86cd07 req-fdf408fb-e2ed-4124-9a15-f00fd87be5d9 service nova] Acquiring lock "refresh_cache-20e9ed05-3592-4e84-8806-0e30c7563b85" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2833.636332] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-8ab33a6a-7ecd-4688-96a0-66fb4d86cd07 req-fdf408fb-e2ed-4124-9a15-f00fd87be5d9 service nova] Acquired lock "refresh_cache-20e9ed05-3592-4e84-8806-0e30c7563b85" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2833.636494] nova-compute[62208]: DEBUG nova.network.neutron [req-8ab33a6a-7ecd-4688-96a0-66fb4d86cd07 req-fdf408fb-e2ed-4124-9a15-f00fd87be5d9 service nova] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Refreshing network info cache for port 66b22077-7e60-4621-875e-429bd3311b9e {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2833.851724] nova-compute[62208]: DEBUG nova.network.neutron [req-8ab33a6a-7ecd-4688-96a0-66fb4d86cd07 req-fdf408fb-e2ed-4124-9a15-f00fd87be5d9 service nova] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Updated VIF entry in instance network info cache for port 66b22077-7e60-4621-875e-429bd3311b9e. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2833.852100] nova-compute[62208]: DEBUG nova.network.neutron [req-8ab33a6a-7ecd-4688-96a0-66fb4d86cd07 req-fdf408fb-e2ed-4124-9a15-f00fd87be5d9 service nova] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Updating instance_info_cache with network_info: [{"id": "66b22077-7e60-4621-875e-429bd3311b9e", "address": "fa:16:3e:76:a7:07", "network": {"id": "3920080a-4a37-46bb-98d8-615215934385", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-635270513-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a811291fa75242d5b998655672131068", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e885ebd4-93ca-4e9e-8889-0f16bd91e61e", "external-id": "nsx-vlan-transportzone-580", "segmentation_id": 580, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap66b22077-7e", "ovs_interfaceid": "66b22077-7e60-4621-875e-429bd3311b9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2833.861450] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-8ab33a6a-7ecd-4688-96a0-66fb4d86cd07 req-fdf408fb-e2ed-4124-9a15-f00fd87be5d9 service nova] Releasing lock "refresh_cache-20e9ed05-3592-4e84-8806-0e30c7563b85" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2834.141528] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2834.151645] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2834.151845] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2834.152034] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2834.152198] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) 
update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2834.153276] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77802f34-2589-47dd-b569-dc3d623556d6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2834.156428] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2834.156428] nova-compute[62208]: warnings.warn( [ 2834.162271] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9f98cce-0059-4f25-b85f-187d3a9f39cd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2834.165792] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2834.165792] nova-compute[62208]: warnings.warn( [ 2834.177350] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37ef0971-b57b-4437-9a69-2f742450980d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2834.179606] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2834.179606] nova-compute[62208]: warnings.warn( [ 2834.183862] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-453d0529-0e32-4cf3-a7d8-9e73f8a39ae7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2834.186573] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2834.186573] nova-compute[62208]: warnings.warn( [ 2834.212475] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181968MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2834.212651] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2834.212825] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2834.263065] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d238f196-f7e6-455b-b514-e8475e204e82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2834.263230] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5289eeb-c269-431e-9a8e-d27487e12b2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2834.263393] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d141090-57cf-442a-a03e-6151d29f2266 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2834.263536] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 20e9ed05-3592-4e84-8806-0e30c7563b85 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2834.263721] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2834.263857] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2834.326110] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e87ae4d-cac8-4154-ba8c-f8882010021b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2834.328599] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2834.328599] nova-compute[62208]: warnings.warn( [ 2834.333719] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00ed1b47-ea10-4bfb-8e50-af180ed4c5e7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2834.336641] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2834.336641] nova-compute[62208]: warnings.warn( [ 2834.363582] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e7d0502-a6f8-4251-8d83-3ea9c0721eda {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2834.365935] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2834.365935] nova-compute[62208]: warnings.warn( [ 2834.370808] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4ab306e-6bd6-496a-9638-f6183300b0ae {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2834.374284] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2834.374284] nova-compute[62208]: warnings.warn( [ 2834.383896] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2834.393434] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2834.411057] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2834.411191] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2840.407512] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2874.984090] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2874.984090] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2874.984090] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2874.985418] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2874.986194] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2874.986444] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Copying Virtual Disk [datastore2] vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/5db6c166-95f6-45bb-9d67-f8df2d292333/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2874.986752] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7def8726-1162-4f76-9c55-cbde0ae13b29 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2874.989267] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2874.989267] nova-compute[62208]: warnings.warn( [ 2874.995345] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Waiting for the task: (returnval){ [ 2874.995345] nova-compute[62208]: value = "task-38743" [ 2874.995345] nova-compute[62208]: _type = "Task" [ 2874.995345] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2874.998418] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2874.998418] nova-compute[62208]: warnings.warn( [ 2875.003783] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Task: {'id': task-38743, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2875.499587] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2875.499587] nova-compute[62208]: warnings.warn( [ 2875.505432] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2875.505716] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2875.506277] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] Traceback (most recent call last): [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] yield resources [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] self.driver.spawn(context, instance, image_meta, [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] self._fetch_image_if_missing(context, vi) [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: 
d238f196-f7e6-455b-b514-e8475e204e82] image_cache(vi, tmp_image_ds_loc) [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] vm_util.copy_virtual_disk( [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] session._wait_for_task(vmdk_copy_task) [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] return self.wait_for_task(task_ref) [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] return evt.wait() [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] result = hub.switch() [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] return self.greenlet.switch() [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] self.f(*self.args, **self.kw) [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] raise exceptions.translate_fault(task_info.error) [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] Faults: ['InvalidArgument'] [ 2875.506277] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] [ 2875.507421] nova-compute[62208]: INFO nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: 
d238f196-f7e6-455b-b514-e8475e204e82] Terminating instance [ 2875.508156] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2875.508370] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2875.508639] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-20a196de-0dd5-4df3-8750-495f2788205f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2875.510730] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2875.510925] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquired lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2875.511103] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2875.511976] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2875.511976] nova-compute[62208]: warnings.warn( [ 2875.518119] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2875.518257] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2875.519486] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f7bbfed-9427-4c23-ba3b-d96f43f25332 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2875.524116] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2875.524116] nova-compute[62208]: warnings.warn( [ 2875.527406] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2875.527406] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52adada4-b6e8-48ab-0f69-a5316e849d1f" [ 2875.527406] nova-compute[62208]: _type = "Task" [ 2875.527406] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2875.530202] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2875.530202] nova-compute[62208]: warnings.warn( [ 2875.535136] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52adada4-b6e8-48ab-0f69-a5316e849d1f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2875.541216] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2875.564289] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2875.573485] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Releasing lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2875.573878] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2875.574065] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2875.575102] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc758e2b-7b13-4051-b1ab-0729f9421386 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2875.577841] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2875.577841] nova-compute[62208]: warnings.warn( [ 2875.582559] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2875.582781] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-801822f0-6f87-41ea-8716-fae92ea6be59 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2875.584147] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2875.584147] nova-compute[62208]: warnings.warn( [ 2875.609516] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2875.609787] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2875.609972] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Deleting the datastore file [datastore2] d238f196-f7e6-455b-b514-e8475e204e82 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2875.610232] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7f7c6a40-ec42-4f0c-89fd-63b5f8276dc2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2875.612103] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2875.612103] nova-compute[62208]: warnings.warn( [ 2875.616548] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Waiting for the task: (returnval){ [ 2875.616548] nova-compute[62208]: value = "task-38745" [ 2875.616548] nova-compute[62208]: _type = "Task" [ 2875.616548] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2875.621117] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2875.621117] nova-compute[62208]: warnings.warn( [ 2875.625931] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Task: {'id': task-38745, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2876.031088] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.031088] nova-compute[62208]: warnings.warn( [ 2876.037570] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2876.037829] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating directory with path [datastore2] vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2876.038061] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-063612b3-f7d5-42ee-8a49-f86d04f97e4e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.039750] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.039750] nova-compute[62208]: warnings.warn( [ 2876.049196] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Created directory with path [datastore2] vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2876.049391] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Fetch image to [datastore2] vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2876.049598] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2876.050324] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7eaa85e4-f216-4ef9-9361-8b76ae15603d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.052579] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.052579] nova-compute[62208]: warnings.warn( [ 2876.057314] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-037dac0e-8f5a-4dc5-bb82-07e0354a4be7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.059486] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.059486] nova-compute[62208]: warnings.warn( [ 2876.066325] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd74dc08-b650-4d90-a265-340046659742 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.069781] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.069781] nova-compute[62208]: warnings.warn( [ 2876.096470] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-135648ff-bd01-4006-b297-72ad6d5a2d61 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.098838] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.098838] nova-compute[62208]: warnings.warn( [ 2876.102509] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-454f2b41-ed4d-4dc3-a427-ded93ba2d2ed {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.104058] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.104058] nova-compute[62208]: warnings.warn( [ 2876.120815] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.120815] nova-compute[62208]: warnings.warn( [ 2876.124039] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2876.128876] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Task: {'id': task-38745, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034604} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2876.129114] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2876.129297] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2876.129480] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2876.129684] nova-compute[62208]: INFO nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Took 0.56 seconds to destroy the instance on the hypervisor. [ 2876.129921] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2876.130125] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 2876.132197] nova-compute[62208]: DEBUG nova.compute.claims [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935d04880> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2876.132364] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2876.132576] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2876.173366] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2876.234239] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2876.234468] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2876.275775] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01086f6c-e402-48b9-b80e-24072baec6b0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.278855] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.278855] nova-compute[62208]: warnings.warn( [ 2876.283853] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93164800-d5c7-47bc-ad81-bb3ba2108fd5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.286767] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.286767] nova-compute[62208]: warnings.warn( [ 2876.313914] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb7ebba5-207b-4184-9ead-5d32ceca03ff {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.316332] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.316332] nova-compute[62208]: warnings.warn( [ 2876.321836] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4ba9231-e41b-4fd0-b1a2-fc72f045cd93 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.325501] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.325501] nova-compute[62208]: warnings.warn( [ 2876.335471] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2876.343938] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2876.359187] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.226s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2876.359715] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] Traceback (most recent call last): [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] self.driver.spawn(context, instance, image_meta, [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] self._fetch_image_if_missing(context, vi) [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing 
[ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] image_cache(vi, tmp_image_ds_loc) [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] vm_util.copy_virtual_disk( [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] session._wait_for_task(vmdk_copy_task) [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] return self.wait_for_task(task_ref) [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] return evt.wait() [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] result = hub.switch() [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] return self.greenlet.switch() [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] self.f(*self.args, **self.kw) [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] raise exceptions.translate_fault(task_info.error) [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] Faults: ['InvalidArgument'] [ 2876.359715] nova-compute[62208]: ERROR nova.compute.manager [instance: d238f196-f7e6-455b-b514-e8475e204e82] [ 2876.360829] nova-compute[62208]: DEBUG nova.compute.utils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 
tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2876.362699] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Build of instance d238f196-f7e6-455b-b514-e8475e204e82 was re-scheduled: A specified parameter was not correct: fileType [ 2876.362699] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2876.363103] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2876.363337] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2876.363487] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquired lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2876.363648] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2876.393796] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2876.420344] nova-compute[62208]: DEBUG nova.network.neutron [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2876.429222] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Releasing lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2876.429431] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2876.429647] nova-compute[62208]: DEBUG nova.compute.manager [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Skipping network deallocation for instance since networking was not requested. {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 2876.517231] nova-compute[62208]: INFO nova.scheduler.client.report [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Deleted allocations for instance d238f196-f7e6-455b-b514-e8475e204e82 [ 2876.536405] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cbe47415-1abb-4f41-a849-85955f1b4f33 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "d238f196-f7e6-455b-b514-e8475e204e82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 240.482s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2876.536687] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "d238f196-f7e6-455b-b514-e8475e204e82" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 44.784s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2876.536915] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "d238f196-f7e6-455b-b514-e8475e204e82-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2876.537125] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 
tempest-ServerShowV257Test-1055589823-project-member] Lock "d238f196-f7e6-455b-b514-e8475e204e82-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2876.537296] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "d238f196-f7e6-455b-b514-e8475e204e82-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2876.539140] nova-compute[62208]: INFO nova.compute.manager [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Terminating instance [ 2876.540646] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquiring lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2876.540822] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Acquired lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2876.541000] nova-compute[62208]: DEBUG nova.network.neutron [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2876.569329] nova-compute[62208]: DEBUG nova.network.neutron [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2876.592569] nova-compute[62208]: DEBUG nova.network.neutron [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2876.600602] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Releasing lock "refresh_cache-d238f196-f7e6-455b-b514-e8475e204e82" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2876.600981] nova-compute[62208]: DEBUG nova.compute.manager [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2876.601175] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2876.601662] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b911fc4a-b151-4c0b-8f72-369b402ba830 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.603487] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.603487] nova-compute[62208]: warnings.warn( [ 2876.609964] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5894befb-11d2-4334-9091-4798bc2d76b2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2876.620148] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2876.620148] nova-compute[62208]: warnings.warn( [ 2876.635225] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d238f196-f7e6-455b-b514-e8475e204e82 could not be found. 
[ 2876.635417] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2876.635595] nova-compute[62208]: INFO nova.compute.manager [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Took 0.03 seconds to destroy the instance on the hypervisor. [ 2876.635835] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2876.636045] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2876.636142] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d238f196-f7e6-455b-b514-e8475e204e82] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2876.655496] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Instance cache missing network info. {{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2876.662718] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2876.670634] nova-compute[62208]: INFO nova.compute.manager [-] [instance: d238f196-f7e6-455b-b514-e8475e204e82] Took 0.03 seconds to deallocate network for instance. 
[ 2876.762089] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6665079b-aa71-407f-954a-56cd762daab6 tempest-ServerShowV257Test-1055589823 tempest-ServerShowV257Test-1055589823-project-member] Lock "d238f196-f7e6-455b-b514-e8475e204e82" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.225s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2879.610767] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "d5289eeb-c269-431e-9a8e-d27487e12b2a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2880.140478] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2883.140600] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2883.140950] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2883.140950] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2883.153933] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2883.154097] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2883.154229] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2883.154357] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2883.154837] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2886.151408] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2888.140949] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2890.143022] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2890.143438] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2890.143438] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2892.141957] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2896.142019] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2896.152608] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2896.152864] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2896.153044] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2896.153205] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2896.154388] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bdb756a-f9cb-4476-9b54-73320872f01d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2896.157656] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2896.157656] nova-compute[62208]: warnings.warn( [ 2896.163903] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18b7b507-6d6f-44aa-9a77-5e0f8fa3ba1e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2896.167782] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2896.167782] nova-compute[62208]: warnings.warn( [ 2896.178648] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa61a014-5831-437a-b880-831d929af703 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2896.181033] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2896.181033] nova-compute[62208]: warnings.warn( [ 2896.185596] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42ca1fc8-898a-4ee5-9e0d-fb3b76a0fd92 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2896.188710] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2896.188710] nova-compute[62208]: warnings.warn( [ 2896.215919] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181965MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2896.216129] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2896.216294] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2896.263823] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance d5289eeb-c269-431e-9a8e-d27487e12b2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2896.264348] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d141090-57cf-442a-a03e-6151d29f2266 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2896.264637] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 20e9ed05-3592-4e84-8806-0e30c7563b85 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2896.264950] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2896.265238] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2896.327086] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b320c777-ae68-4137-8bd4-3088c62ac4ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2896.329938] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2896.329938] nova-compute[62208]: warnings.warn( [ 2896.335456] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8023b0f6-1185-49d2-9b7b-866240b7faff {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2896.338772] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2896.338772] nova-compute[62208]: warnings.warn( [ 2896.366686] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d527754d-07a6-4103-aaf0-be281067dfad {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2896.369243] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2896.369243] nova-compute[62208]: warnings.warn( [ 2896.375227] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-593917ae-1a4e-4fc6-8546-e62b04f64fe5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2896.379230] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2896.379230] nova-compute[62208]: warnings.warn( [ 2896.389822] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2896.399228] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2896.417939] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2896.418230] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2911.141708] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_image_cache_manager_pass {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2911.142120] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2911.142602] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2911.142914] nova-compute[62208]: DEBUG 
oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2911.143065] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2911.143333] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2911.143577] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2911.160591] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c417d869-996b-45ef-881f-b219e40a4b38 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2911.163325] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2911.163325] nova-compute[62208]: warnings.warn( [ 2911.169994] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b24b7927-fb6a-465c-9801-cd22cf13c8a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2911.173955] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2911.173955] nova-compute[62208]: warnings.warn( [ 2911.197208] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-65e3bf4e-804e-40a7-b91a-1d7557afca3e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2911.199042] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2911.199042] nova-compute[62208]: warnings.warn( [ 2911.202574] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Waiting for the task: (returnval){ [ 2911.202574] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d5fa81-b588-9ef1-98e4-923b78aa5a25" [ 2911.202574] nova-compute[62208]: _type = "Task" [ 2911.202574] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2911.205595] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2911.205595] nova-compute[62208]: warnings.warn( [ 2911.210536] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d5fa81-b588-9ef1-98e4-923b78aa5a25, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2911.706889] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2911.706889] nova-compute[62208]: warnings.warn( [ 2911.712898] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52d5fa81-b588-9ef1-98e4-923b78aa5a25, 'name': SearchDatastore_Task, 'duration_secs': 0.010312} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2911.713303] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2911.713430] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquired lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2911.713709] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquired external semaphore "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2911.714283] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f796c5e3-52e0-4a6b-a1cf-8814d4d90478 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2911.715909] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2911.715909] nova-compute[62208]: warnings.warn( [ 2911.719364] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Waiting for the task: (returnval){ [ 2911.719364] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52391e78-e069-cbf8-b2e3-41240a411349" [ 2911.719364] nova-compute[62208]: _type = "Task" [ 2911.719364] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2911.722667] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2911.722667] nova-compute[62208]: warnings.warn( [ 2911.727431] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52391e78-e069-cbf8-b2e3-41240a411349, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2912.224019] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2912.224019] nova-compute[62208]: warnings.warn( [ 2912.230222] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Releasing lock "[datastore1] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2912.230468] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "[datastore1] devstack-image-cache_base/" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2912.230591] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquired lock "[datastore1] devstack-image-cache_base/" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2912.230927] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquired external semaphore "[datastore1] devstack-image-cache_base/" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2912.231208] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6b86b9c6-40a6-48ac-bac2-6b87104a10da {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2912.233021] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2912.233021] nova-compute[62208]: warnings.warn( [ 2912.236162] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Waiting for the task: (returnval){ [ 2912.236162] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525099b0-5bb5-3d88-68df-bd6c716b6fc6" [ 2912.236162] nova-compute[62208]: _type = "Task" [ 2912.236162] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2912.238931] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2912.238931] nova-compute[62208]: warnings.warn( [ 2912.244359] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525099b0-5bb5-3d88-68df-bd6c716b6fc6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2912.740664] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2912.740664] nova-compute[62208]: warnings.warn( [ 2912.746661] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525099b0-5bb5-3d88-68df-bd6c716b6fc6, 'name': SearchDatastore_Task, 'duration_secs': 0.015771} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2912.746917] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Releasing lock "[datastore1] devstack-image-cache_base/" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2912.747218] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-acb92545-6637-40ea-b56e-f69428b88171 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2912.748792] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2912.748792] nova-compute[62208]: warnings.warn( [ 2912.751902] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Waiting for the task: (returnval){ [ 2912.751902] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]520962a8-9e81-e56b-e6db-2c70609102b9" [ 2912.751902] nova-compute[62208]: _type = "Task" [ 2912.751902] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2912.754674] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2912.754674] nova-compute[62208]: warnings.warn( [ 2912.759639] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]520962a8-9e81-e56b-e6db-2c70609102b9, 'name': SearchDatastore_Task, 'duration_secs': 0.00569} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2912.759930] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2912.760070] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2912.760354] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2912.760612] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-63a145d4-6e86-4286-86bb-5c0c746477bd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2912.762197] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2912.762197] nova-compute[62208]: warnings.warn( [ 2912.765031] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Waiting for the task: (returnval){ [ 2912.765031] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52bbad55-243a-1aa7-2060-71d6b19469fd" [ 2912.765031] nova-compute[62208]: _type = "Task" [ 2912.765031] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2912.767677] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2912.767677] nova-compute[62208]: warnings.warn( [ 2912.772286] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52bbad55-243a-1aa7-2060-71d6b19469fd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2913.269151] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2913.269151] nova-compute[62208]: warnings.warn( [ 2913.276655] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2913.276842] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "[datastore2] devstack-image-cache_base/" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2913.276962] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquired lock "[datastore2] devstack-image-cache_base/" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2913.277266] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquired external semaphore "[datastore2] devstack-image-cache_base/" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2913.277532] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4359304a-52cd-44e6-8a54-4f0cfc0e0172 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2913.279009] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2913.279009] nova-compute[62208]: warnings.warn( [ 2913.281880] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Waiting for the task: (returnval){ [ 2913.281880] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a9f917-e244-b2d1-1499-dcf0205cec66" [ 2913.281880] nova-compute[62208]: _type = "Task" [ 2913.281880] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2913.284595] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2913.284595] nova-compute[62208]: warnings.warn( [ 2913.289492] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a9f917-e244-b2d1-1499-dcf0205cec66, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2913.786421] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 2913.786421] nova-compute[62208]: warnings.warn(
[ 2913.792280] nova-compute[62208]: DEBUG oslo_vmware.api [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52a9f917-e244-b2d1-1499-dcf0205cec66, 'name': SearchDatastore_Task, 'duration_secs': 0.0141} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2913.792531] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Releasing lock "[datastore2] devstack-image-cache_base/" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2926.095846] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin()
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2926.095846] nova-compute[62208]: ERROR oslo_vmware.rw_handles
[ 2926.096651] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2926.098361] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2926.098642] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Copying Virtual Disk [datastore2] vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/ff1cfb92-ef81-478f-9ee3-a36481d1efce/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2926.098964] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8fb88284-033a-4873-80fe-07572d1916a2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2926.101129] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 2926.101129] nova-compute[62208]: warnings.warn(
[ 2926.107075] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){
[ 2926.107075] nova-compute[62208]: value = "task-38746"
[ 2926.107075] nova-compute[62208]: _type = "Task"
[ 2926.107075] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2926.110774] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 2926.110774] nova-compute[62208]: warnings.warn(
[ 2926.116221] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38746, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2926.611259] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 2926.611259] nova-compute[62208]: warnings.warn(
[ 2926.617337] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2926.617619] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2926.618228] nova-compute[62208]: Faults: ['InvalidArgument']
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Traceback (most recent call last):
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] yield resources
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] self.driver.spawn(context, instance, image_meta,
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] self._fetch_image_if_missing(context, vi)
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] image_cache(vi, tmp_image_ds_loc)
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] vm_util.copy_virtual_disk(
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] session._wait_for_task(vmdk_copy_task)
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] return self.wait_for_task(task_ref)
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] return evt.wait()
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] result = hub.switch()
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] return self.greenlet.switch()
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] self.f(*self.args, **self.kw)
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] raise exceptions.translate_fault(task_info.error)
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Faults: ['InvalidArgument']
[ 2926.618228] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a]
[ 2926.619430] nova-compute[62208]: INFO nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Terminating instance
[ 2926.620144] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2926.620349] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2]
devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2926.620591] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8719705f-d100-408d-b6a0-8a4b230b206a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2926.622795] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2926.622990] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2926.623791] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ce93649-6601-4e5b-bc0a-441b54a1befa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2926.626183] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2926.626183] nova-compute[62208]: warnings.warn( [ 2926.626552] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2926.626552] nova-compute[62208]: warnings.warn( [ 2926.630880] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2926.631102] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7d3d8ac9-d8fd-47bd-846e-8381955912e4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2926.633327] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2926.633500] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2926.634070] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2926.634070] nova-compute[62208]: warnings.warn( [ 2926.634474] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dc5c268a-2754-49f3-ba76-3c3c3f272f5a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2926.636396] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2926.636396] nova-compute[62208]: warnings.warn( [ 2926.639313] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2926.639313] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]526859cc-3cf5-40ce-8988-bb78014eec0e" [ 2926.639313] nova-compute[62208]: _type = "Task" [ 2926.639313] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2926.642366] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2926.642366] nova-compute[62208]: warnings.warn( [ 2926.647593] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]526859cc-3cf5-40ce-8988-bb78014eec0e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2926.706059] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2926.706280] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2926.706518] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleting the datastore file [datastore2] d5289eeb-c269-431e-9a8e-d27487e12b2a {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2926.706806] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0f10f511-7080-4919-aa6f-8502109a9fd8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2926.708665] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2926.708665] nova-compute[62208]: warnings.warn( [ 2926.714765] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for the task: (returnval){ [ 2926.714765] nova-compute[62208]: value = "task-38748" [ 2926.714765] nova-compute[62208]: _type = "Task" [ 2926.714765] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2926.718006] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2926.718006] nova-compute[62208]: warnings.warn( [ 2926.723021] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38748, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2927.144179] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.144179] nova-compute[62208]: warnings.warn( [ 2927.150183] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2927.150444] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2927.150686] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fec1db06-1860-4f10-927d-8ec3f3ba2543 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.152866] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.152866] nova-compute[62208]: warnings.warn( [ 2927.162804] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2927.163019] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Fetch image to [datastore2] vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2927.163195] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2927.164024] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54bfedb0-8005-4574-8ff8-a4fa83b966cc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.166373] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.166373] nova-compute[62208]: warnings.warn( [ 2927.170908] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b086e0e-1d42-4344-8c06-de859dbf0fe1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.173359] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.173359] nova-compute[62208]: warnings.warn( [ 2927.180655] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ade904e-96b3-4377-bfcf-880333abff02 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.184608] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.184608] nova-compute[62208]: warnings.warn( [ 2927.213574] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7ec6478-80c8-464d-b88b-dbae1544cf8f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.218695] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.218695] nova-compute[62208]: warnings.warn( [ 2927.219240] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.219240] nova-compute[62208]: warnings.warn( [ 2927.225465] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5f4e9c9f-191b-484d-aa1c-f332f7ccba60 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.227204] nova-compute[62208]: DEBUG oslo_vmware.api [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Task: {'id': task-38748, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082819} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2927.227460] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2927.227649] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2927.227820] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2927.227986] nova-compute[62208]: INFO nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Took 0.61 seconds to destroy the instance on the hypervisor. [ 2927.229495] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.229495] nova-compute[62208]: warnings.warn( [ 2927.230282] nova-compute[62208]: DEBUG nova.compute.claims [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935d4f160> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2927.230478] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2927.230698] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2927.252838] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2927.306432] nova-compute[62208]: DEBUG 
oslo_vmware.rw_handles [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2927.363159] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2927.363371] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2927.413874] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62da9ebf-6d9c-45e5-be61-788773d125c8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.416393] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.416393] nova-compute[62208]: warnings.warn( [ 2927.422049] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cce5b86a-d150-4c03-b721-aec1411fa67b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.425233] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.425233] nova-compute[62208]: warnings.warn( [ 2927.451961] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d6e2594-fe9a-4aa1-9ea3-7d96dfde9ad1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.454380] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.454380] nova-compute[62208]: warnings.warn( [ 2927.459716] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb82741c-7b38-45fe-9184-5c06a3c0ec1d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.464407] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.464407] nova-compute[62208]: warnings.warn( [ 2927.473949] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2927.482441] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2927.498044] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.267s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2927.498792] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Traceback (most recent call last): [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] self.driver.spawn(context, instance, image_meta, [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2927.498792] nova-compute[62208]: ERROR 
nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] self._fetch_image_if_missing(context, vi) [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] image_cache(vi, tmp_image_ds_loc) [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] vm_util.copy_virtual_disk( [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] session._wait_for_task(vmdk_copy_task) [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] return self.wait_for_task(task_ref) [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] return evt.wait() [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] result = hub.switch() [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] return self.greenlet.switch() [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] self.f(*self.args, **self.kw) [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: 
d5289eeb-c269-431e-9a8e-d27487e12b2a] raise exceptions.translate_fault(task_info.error) [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Faults: ['InvalidArgument'] [ 2927.498792] nova-compute[62208]: ERROR nova.compute.manager [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] [ 2927.499927] nova-compute[62208]: DEBUG nova.compute.utils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2927.501278] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Build of instance d5289eeb-c269-431e-9a8e-d27487e12b2a was re-scheduled: A specified parameter was not correct: fileType [ 2927.501278] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2927.501671] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2927.501873] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2927.502057] nova-compute[62208]: DEBUG nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2927.502227] nova-compute[62208]: DEBUG nova.network.neutron [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2927.758230] nova-compute[62208]: DEBUG nova.network.neutron [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2927.773251] nova-compute[62208]: INFO nova.compute.manager [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Took 0.27 seconds to deallocate network for instance. [ 2927.869826] nova-compute[62208]: INFO nova.scheduler.client.report [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Deleted allocations for instance d5289eeb-c269-431e-9a8e-d27487e12b2a [ 2927.892580] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-acd6d392-b889-4c50-b421-2fa8ba3e3d55 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "d5289eeb-c269-431e-9a8e-d27487e12b2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 244.882s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2927.892963] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "d5289eeb-c269-431e-9a8e-d27487e12b2a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 48.282s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2927.893308] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Acquiring lock "d5289eeb-c269-431e-9a8e-d27487e12b2a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2927.893536] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "d5289eeb-c269-431e-9a8e-d27487e12b2a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s 
{{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2927.893710] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "d5289eeb-c269-431e-9a8e-d27487e12b2a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2927.895934] nova-compute[62208]: INFO nova.compute.manager [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Terminating instance [ 2927.897863] nova-compute[62208]: DEBUG nova.compute.manager [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2927.898088] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2927.898593] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f45f8ff7-b58a-4a78-b728-2deebb3b3497 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.901176] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.901176] nova-compute[62208]: warnings.warn( [ 2927.908044] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8cd464a-b7ac-41b7-be18-2459485a1814 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2927.918326] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2927.918326] nova-compute[62208]: warnings.warn( [ 2927.934422] nova-compute[62208]: WARNING nova.virt.vmwareapi.vmops [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d5289eeb-c269-431e-9a8e-d27487e12b2a could not be found. 
[ 2927.934641] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2927.934825] nova-compute[62208]: INFO nova.compute.manager [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2927.935083] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2927.935612] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2927.935713] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2927.961747] nova-compute[62208]: DEBUG nova.network.neutron [-] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2927.969907] nova-compute[62208]: INFO nova.compute.manager [-] [instance: d5289eeb-c269-431e-9a8e-d27487e12b2a] Took 0.03 seconds to deallocate network for instance. [ 2928.066953] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-b7c4b559-dbc2-43d6-9c07-a5ce77276267 tempest-ServersTestJSON-1219570964 tempest-ServersTestJSON-1219570964-project-member] Lock "d5289eeb-c269-431e-9a8e-d27487e12b2a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.174s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2944.793930] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2944.794318] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 2944.794318] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 2944.806348] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2944.806538] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 2944.806675] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 2944.807161] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2945.141209] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2948.136841] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2948.140513] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2950.141780] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2950.142176] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2950.142176] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 2952.141526] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2957.142380] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2957.153047] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2957.153315] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2957.153468] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2957.153634] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2957.155156] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dc3e011-38ca-4903-a40a-ba382e3d9ee8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2957.158017] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2957.158017] nova-compute[62208]: warnings.warn( [ 2957.163958] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fea40676-a38d-4860-b683-b57714583af4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2957.168082] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2957.168082] nova-compute[62208]: warnings.warn( [ 2957.179439] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e63db9f2-d0ea-4862-b368-bb3c20f429b5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2957.182021] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2957.182021] nova-compute[62208]: warnings.warn( [ 2957.186925] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bfe3bd9-6084-4f1c-944b-c24f9a3d57ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2957.190112] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2957.190112] nova-compute[62208]: warnings.warn( [ 2957.215784] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181950MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2957.215952] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2957.216148] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2957.359767] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 4d141090-57cf-442a-a03e-6151d29f2266 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2957.359947] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 20e9ed05-3592-4e84-8806-0e30c7563b85 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2957.360165] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2957.360310] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2957.377998] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing inventories for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2957.393002] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating ProviderTree inventory for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2957.393209] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Updating inventory in ProviderTree for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2957.405454] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing aggregate associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, aggregates: None {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2957.423167] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Refreshing trait associations for resource provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=62208) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2957.464162] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f9f178c-d6aa-4bfe-9d28-020ed73f5ecd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2957.466670] nova-compute[62208]: 
/opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2957.466670] nova-compute[62208]: warnings.warn( [ 2957.472298] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e9f1653-160a-4dcb-8b25-d6995d25800e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2957.475308] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2957.475308] nova-compute[62208]: warnings.warn( [ 2957.504667] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d1abe8c-81ab-4313-a89e-1d4692729a19 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2957.507145] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2957.507145] nova-compute[62208]: warnings.warn( [ 2957.513038] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb31a5d5-f611-4127-95ef-946458be4274 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2957.516825] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2957.516825] nova-compute[62208]: warnings.warn( [ 2957.527444] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2957.536434] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2957.552726] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2957.552870] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2962.546902] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2964.141717] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2964.142113] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances with incomplete migration {{(pid=62208) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11257}} [ 2967.150165] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2967.150574] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Cleaning up deleted instances {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11219}} [ 2967.160349] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] There are 0 instances to clean {{(pid=62208) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11228}} [ 2972.424511] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None 
req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2972.424511] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 2972.425042] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2972.427034] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2972.427279] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Copying Virtual Disk [datastore2] vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/2db4608d-3dcd-467b-8c5b-df085e6198b4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2972.427587] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ee2fb7cd-87f8-41de-b99e-745866601cb7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2972.429844] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2972.429844] nova-compute[62208]: warnings.warn( [ 2972.435819] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2972.435819] nova-compute[62208]: value = "task-38749" [ 2972.435819] nova-compute[62208]: _type = "Task" [ 2972.435819] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2972.439760] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2972.439760] nova-compute[62208]: warnings.warn( [ 2972.444734] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38749, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2972.940839] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2972.940839] nova-compute[62208]: warnings.warn( [ 2972.947102] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2972.947384] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2972.947939] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Traceback (most recent call last): [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] yield resources [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] self.driver.spawn(context, instance, image_meta, [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] self._fetch_image_if_missing(context, vi) [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] image_cache(vi, tmp_image_ds_loc) [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] vm_util.copy_virtual_disk( [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] 
session._wait_for_task(vmdk_copy_task) [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] return self.wait_for_task(task_ref) [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] return evt.wait() [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] result = hub.switch() [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] return self.greenlet.switch() [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] self.f(*self.args, **self.kw) [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] raise exceptions.translate_fault(task_info.error) [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Faults: ['InvalidArgument'] [ 2972.947939] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] [ 2972.949312] nova-compute[62208]: INFO nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Terminating instance [ 2972.949890] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2972.950088] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 
tempest-AttachVolumeShelveTestJSON-216431597-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2972.950316] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-396489c9-ddb0-4990-b7ca-4a8d776df7ce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2972.952510] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 2972.952699] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2972.953398] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17eb549b-a7c1-490d-ad22-0dcfa3da5dcd {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2972.956740] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2972.956740] nova-compute[62208]: warnings.warn( [ 2972.957020] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2972.957020] nova-compute[62208]: warnings.warn( [ 2972.961551] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2972.961768] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7a0af2c8-ef3d-4a17-ac29-4259dedcd34d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2972.963896] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2972.964088] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2972.964642] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2972.964642] nova-compute[62208]: warnings.warn( [ 2972.965019] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e403d615-f423-4c08-bc0d-5666581d3749 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2972.966913] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2972.966913] nova-compute[62208]: warnings.warn( [ 2972.969665] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 2972.969665] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527d16a2-91f4-a442-8208-78154213a0ad" [ 2972.969665] nova-compute[62208]: _type = "Task" [ 2972.969665] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2972.972281] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2972.972281] nova-compute[62208]: warnings.warn( [ 2972.976831] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]527d16a2-91f4-a442-8208-78154213a0ad, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2973.030193] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2973.030405] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2973.030591] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleting the datastore file [datastore2] 4d141090-57cf-442a-a03e-6151d29f2266 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2973.030868] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1845bdd4-12f0-4243-9e9e-ba1d0a2825d8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.032704] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.032704] nova-compute[62208]: warnings.warn( [ 2973.037660] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2973.037660] nova-compute[62208]: value = "task-38751" [ 2973.037660] nova-compute[62208]: _type = "Task" [ 2973.037660] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2973.040637] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.040637] nova-compute[62208]: warnings.warn( [ 2973.045337] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38751, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2973.474338] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.474338] nova-compute[62208]: warnings.warn( [ 2973.480562] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2973.480838] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Creating directory with path [datastore2] vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2973.481155] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7d2a704f-7366-4058-a3fd-934992f34e9e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.483040] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.483040] nova-compute[62208]: warnings.warn( [ 2973.492901] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Created directory with path [datastore2] vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2973.493098] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Fetch image to [datastore2] vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2973.493270] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2973.493984] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe3c3bac-202c-40fe-8e22-6362f2b02571 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.496223] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.496223] nova-compute[62208]: warnings.warn( [ 2973.500471] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7095c9e4-81f9-4527-96bc-cfa950a6faac {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.502663] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.502663] nova-compute[62208]: warnings.warn( [ 2973.509603] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5f30c48-b353-4b16-8c35-ea0a7ae0c4c3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.513106] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.513106] nova-compute[62208]: warnings.warn( [ 2973.541904] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5d85d50-9da0-4634-b13f-7a17221b1d8c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.544073] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.544073] nova-compute[62208]: warnings.warn( [ 2973.544458] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.544458] nova-compute[62208]: warnings.warn( [ 2973.549338] nova-compute[62208]: DEBUG oslo_vmware.api [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38751, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07609} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2973.550775] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2973.550985] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2973.551168] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2973.551344] nova-compute[62208]: INFO nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2973.553172] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f5b974d8-4772-439b-b8e9-afcd9513df64 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.554992] nova-compute[62208]: DEBUG nova.compute.claims [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9372e67d0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2973.555167] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2973.555374] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2973.557793] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.557793] nova-compute[62208]: warnings.warn( [ 2973.577570] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2973.627965] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2973.684349] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6794fdf5-ff8e-442d-bae7-48b0336f0423 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.688213] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.688213] nova-compute[62208]: warnings.warn( [ 2973.689351] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2973.689498] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2973.695373] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d3ce4ed-b45c-493d-8b65-4fefdcd03280 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.698466] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.698466] nova-compute[62208]: warnings.warn( [ 2973.726560] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7091897f-ae29-4e48-a0aa-3b94699375d8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.729001] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.729001] nova-compute[62208]: warnings.warn( [ 2973.734189] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fe792cd-0d9b-45b0-9144-26a881a87625 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2973.738028] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2973.738028] nova-compute[62208]: warnings.warn( [ 2973.747684] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2973.755828] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2973.772288] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.217s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2973.772787] nova-compute[62208]: Faults: ['InvalidArgument'] [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Traceback (most recent call last): [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] self.driver.spawn(context, instance, image_meta, [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] self._fetch_image_if_missing(context, vi) [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] image_cache(vi, tmp_image_ds_loc) [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] vm_util.copy_virtual_disk( [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] session._wait_for_task(vmdk_copy_task) [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] return self.wait_for_task(task_ref) [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] return evt.wait() [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] result = hub.switch() [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] return self.greenlet.switch() [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] self.f(*self.args, **self.kw) [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] raise exceptions.translate_fault(task_info.error) [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Faults: ['InvalidArgument'] [ 2973.772787] nova-compute[62208]: ERROR nova.compute.manager [instance: 4d141090-57cf-442a-a03e-6151d29f2266] [ 2973.773693] nova-compute[62208]: DEBUG nova.compute.utils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde 
tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2973.775734] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Build of instance 4d141090-57cf-442a-a03e-6151d29f2266 was re-scheduled: A specified parameter was not correct: fileType [ 2973.775734] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 2973.776212] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 2973.776394] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 2973.776615] nova-compute[62208]: DEBUG nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 2973.776793] nova-compute[62208]: DEBUG nova.network.neutron [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2974.002801] nova-compute[62208]: DEBUG nova.network.neutron [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2974.015737] nova-compute[62208]: INFO nova.compute.manager [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 4d141090-57cf-442a-a03e-6151d29f2266] Took 0.24 seconds to deallocate network for instance. 
[ 2974.114721] nova-compute[62208]: INFO nova.scheduler.client.report [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted allocations for instance 4d141090-57cf-442a-a03e-6151d29f2266 [ 2974.133780] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-604118ba-21a9-4a9b-9bda-6c184136cfde tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "4d141090-57cf-442a-a03e-6151d29f2266" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 190.639s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2974.141047] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2975.526286] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "066392ef-3ddc-47a2-8397-813436e670c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2975.527091] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "066392ef-3ddc-47a2-8397-813436e670c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2975.539099] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 2975.592032] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2975.592432] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2975.594262] nova-compute[62208]: INFO nova.compute.claims [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2975.678724] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-348b347c-eafc-421b-b55d-7f5c26005471 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2975.681397] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2975.681397] nova-compute[62208]: warnings.warn( [ 2975.686813] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-894db1e2-704f-43ce-a609-40ca03ed078c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2975.689914] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2975.689914] nova-compute[62208]: warnings.warn( [ 2975.717027] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9cc2a2a-259d-400c-b02e-e812e3091c7d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2975.719331] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2975.719331] nova-compute[62208]: warnings.warn( [ 2975.724506] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9957c359-046a-4388-946b-807a5cb9118c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2975.728228] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2975.728228] nova-compute[62208]: warnings.warn( [ 2975.737999] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2975.747039] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2975.766938] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2975.767405] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 2975.804510] nova-compute[62208]: DEBUG nova.compute.utils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2975.806353] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2975.806579] nova-compute[62208]: DEBUG nova.network.neutron [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2975.819901] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 2975.855394] nova-compute[62208]: DEBUG nova.policy [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8cb00a6413b46fcb17cbe532a0bffc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '53b578fa6aa34a2d80eb9938d58ffe12', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 2975.896719] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 2975.919430] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2975.919705] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2975.919863] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2975.920059] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2975.920211] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2975.920361] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2975.920574] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2975.920849] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 2975.921011] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2975.921180] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2975.921351] nova-compute[62208]: DEBUG nova.virt.hardware [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2975.922203] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3117292-3703-45ad-bece-a0f245aff776 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2975.925673] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2975.925673] nova-compute[62208]: warnings.warn( [ 2975.934038] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1d4d966-6685-4ebf-ae62-4a615dc3fa1d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2975.937562] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2975.937562] nova-compute[62208]: warnings.warn( [ 2976.132973] nova-compute[62208]: DEBUG nova.network.neutron [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Successfully created port: 687be120-fb80-4891-9bb9-93eb06f2b399 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2976.802738] nova-compute[62208]: DEBUG nova.compute.manager [req-f3b516c7-0a49-4e3f-b4fe-75f2349977fb req-69a3ba39-c325-43b0-9a8f-a9f5763d9d1b service nova] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Received event network-vif-plugged-687be120-fb80-4891-9bb9-93eb06f2b399 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2976.802994] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b516c7-0a49-4e3f-b4fe-75f2349977fb req-69a3ba39-c325-43b0-9a8f-a9f5763d9d1b service nova] Acquiring lock "066392ef-3ddc-47a2-8397-813436e670c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2976.803194] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b516c7-0a49-4e3f-b4fe-75f2349977fb req-69a3ba39-c325-43b0-9a8f-a9f5763d9d1b service nova] Lock "066392ef-3ddc-47a2-8397-813436e670c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2976.803358] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-f3b516c7-0a49-4e3f-b4fe-75f2349977fb req-69a3ba39-c325-43b0-9a8f-a9f5763d9d1b service nova] Lock "066392ef-3ddc-47a2-8397-813436e670c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2976.803521] nova-compute[62208]: DEBUG nova.compute.manager [req-f3b516c7-0a49-4e3f-b4fe-75f2349977fb req-69a3ba39-c325-43b0-9a8f-a9f5763d9d1b service nova] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] No waiting events found dispatching network-vif-plugged-687be120-fb80-4891-9bb9-93eb06f2b399 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2976.803680] nova-compute[62208]: WARNING nova.compute.manager [req-f3b516c7-0a49-4e3f-b4fe-75f2349977fb req-69a3ba39-c325-43b0-9a8f-a9f5763d9d1b service nova] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Received unexpected event network-vif-plugged-687be120-fb80-4891-9bb9-93eb06f2b399 for instance with vm_state building and task_state spawning. 
[ 2976.883486] nova-compute[62208]: DEBUG nova.network.neutron [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Successfully updated port: 687be120-fb80-4891-9bb9-93eb06f2b399 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2976.894266] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "refresh_cache-066392ef-3ddc-47a2-8397-813436e670c3" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2976.894436] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "refresh_cache-066392ef-3ddc-47a2-8397-813436e670c3" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2976.894585] nova-compute[62208]: DEBUG nova.network.neutron [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2976.962328] nova-compute[62208]: DEBUG nova.network.neutron [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2977.112385] nova-compute[62208]: DEBUG nova.network.neutron [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Updating instance_info_cache with network_info: [{"id": "687be120-fb80-4891-9bb9-93eb06f2b399", "address": "fa:16:3e:3b:ca:a0", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap687be120-fb", "ovs_interfaceid": "687be120-fb80-4891-9bb9-93eb06f2b399", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2977.125750] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "refresh_cache-066392ef-3ddc-47a2-8397-813436e670c3" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2977.126100] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Instance network_info: |[{"id": "687be120-fb80-4891-9bb9-93eb06f2b399", "address": "fa:16:3e:3b:ca:a0", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap687be120-fb", "ovs_interfaceid": "687be120-fb80-4891-9bb9-93eb06f2b399", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 2977.126549] 
nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3b:ca:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b5f9472-1844-4c99-8804-8f193cfff562', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '687be120-fb80-4891-9bb9-93eb06f2b399', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2977.133950] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2977.134452] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2977.134707] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-919a3141-78b8-452c-bc22-02517ab351d0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2977.148607] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2977.148607] nova-compute[62208]: warnings.warn( [ 2977.154829] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2977.154829] nova-compute[62208]: value = "task-38752" [ 2977.154829] nova-compute[62208]: _type = "Task" [ 2977.154829] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2977.158198] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2977.158198] nova-compute[62208]: warnings.warn( [ 2977.163347] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38752, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2977.659065] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2977.659065] nova-compute[62208]: warnings.warn( [ 2977.665479] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38752, 'name': CreateVM_Task, 'duration_secs': 0.29567} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2977.665663] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2977.666253] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2977.666490] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2977.669691] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0e160a-5c39-447c-bb2d-d1155fa8e698 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2977.680897] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2977.680897] nova-compute[62208]: warnings.warn( [ 2977.698739] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 2977.699059] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-f4538307-3db5-44de-a8f4-0d691f9ecaa4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2977.708925] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2977.708925] nova-compute[62208]: warnings.warn( [ 2977.715014] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2977.715014] nova-compute[62208]: value = "task-38753" [ 2977.715014] nova-compute[62208]: _type = "Task" [ 2977.715014] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2977.718088] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2977.718088] nova-compute[62208]: warnings.warn( [ 2977.723844] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38753, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2978.218915] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2978.218915] nova-compute[62208]: warnings.warn( [ 2978.224962] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38753, 'name': ReconfigVM_Task, 'duration_secs': 0.111946} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2978.225229] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 2978.225440] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.559s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2978.225685] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2978.225832] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2978.226162] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 
tempest-DeleteServersTestJSON-2066579215-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2978.226429] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bf31b69c-52e9-4664-b389-e542c1b1b685 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2978.227988] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2978.227988] nova-compute[62208]: warnings.warn( [ 2978.231444] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 2978.231444] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52183452-eb38-b8f6-e2b8-e06c38809d8b" [ 2978.231444] nova-compute[62208]: _type = "Task" [ 2978.231444] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2978.234591] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2978.234591] nova-compute[62208]: warnings.warn( [ 2978.239938] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52183452-eb38-b8f6-e2b8-e06c38809d8b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2978.735465] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 2978.735465] nova-compute[62208]: warnings.warn( [ 2978.743074] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2978.743331] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2978.743553] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2978.830184] nova-compute[62208]: DEBUG nova.compute.manager [req-6646340a-cdca-4a5c-bc14-5bc16e8ad95d req-ee89c425-eda9-48c1-8ca7-3d411f191218 service nova] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Received event network-changed-687be120-fb80-4891-9bb9-93eb06f2b399 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 2978.830390] nova-compute[62208]: DEBUG nova.compute.manager [req-6646340a-cdca-4a5c-bc14-5bc16e8ad95d req-ee89c425-eda9-48c1-8ca7-3d411f191218 service nova] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Refreshing instance network info cache due to event network-changed-687be120-fb80-4891-9bb9-93eb06f2b399. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 2978.830617] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-6646340a-cdca-4a5c-bc14-5bc16e8ad95d req-ee89c425-eda9-48c1-8ca7-3d411f191218 service nova] Acquiring lock "refresh_cache-066392ef-3ddc-47a2-8397-813436e670c3" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2978.830780] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-6646340a-cdca-4a5c-bc14-5bc16e8ad95d req-ee89c425-eda9-48c1-8ca7-3d411f191218 service nova] Acquired lock "refresh_cache-066392ef-3ddc-47a2-8397-813436e670c3" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2978.831219] nova-compute[62208]: DEBUG nova.network.neutron [req-6646340a-cdca-4a5c-bc14-5bc16e8ad95d req-ee89c425-eda9-48c1-8ca7-3d411f191218 service nova] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Refreshing network info cache for port 687be120-fb80-4891-9bb9-93eb06f2b399 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2979.056218] nova-compute[62208]: DEBUG nova.network.neutron [req-6646340a-cdca-4a5c-bc14-5bc16e8ad95d req-ee89c425-eda9-48c1-8ca7-3d411f191218 service nova] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Updated VIF entry in instance network info cache for port 687be120-fb80-4891-9bb9-93eb06f2b399. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2979.056627] nova-compute[62208]: DEBUG nova.network.neutron [req-6646340a-cdca-4a5c-bc14-5bc16e8ad95d req-ee89c425-eda9-48c1-8ca7-3d411f191218 service nova] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Updating instance_info_cache with network_info: [{"id": "687be120-fb80-4891-9bb9-93eb06f2b399", "address": "fa:16:3e:3b:ca:a0", "network": {"id": "c0329fb5-81ed-4fa9-8055-2a785eb26d8d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1430402323-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "53b578fa6aa34a2d80eb9938d58ffe12", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5f9472-1844-4c99-8804-8f193cfff562", "external-id": "nsx-vlan-transportzone-445", "segmentation_id": 445, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap687be120-fb", "ovs_interfaceid": "687be120-fb80-4891-9bb9-93eb06f2b399", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2979.065766] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-6646340a-cdca-4a5c-bc14-5bc16e8ad95d req-ee89c425-eda9-48c1-8ca7-3d411f191218 service nova] Releasing lock "refresh_cache-066392ef-3ddc-47a2-8397-813436e670c3" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3003.147603] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3004.142345] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3004.142345] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 3004.142345] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 3004.154343] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 3004.154539] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 3004.154707] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 3005.140785] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3008.136580] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3010.142929] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3010.143271] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3010.143334] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 3012.141593] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3013.141981] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3018.141231] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3018.151593] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3018.151803] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3018.151980] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3018.152153] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 3018.153571] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f970dee-ced0-4302-ae66-0fc159cf1cc6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3018.156387] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3018.156387] nova-compute[62208]: warnings.warn( [ 3018.162133] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d32087d0-2d0e-45af-a2e1-17b10e00e566 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3018.167015] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3018.167015] nova-compute[62208]: warnings.warn( [ 3018.177359] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d8435aa-ab63-459d-985d-c40b62522588 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3018.179483] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3018.179483] nova-compute[62208]: warnings.warn( [ 3018.183489] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-491c11c3-d81c-4663-b04e-2383f59524e3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3018.186297] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3018.186297] nova-compute[62208]: warnings.warn( [ 3018.211926] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181928MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 3018.212097] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3018.212295] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3018.253050] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 20e9ed05-3592-4e84-8806-0e30c7563b85 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 3018.253313] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 066392ef-3ddc-47a2-8397-813436e670c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 3018.253498] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 3018.253640] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 3018.292511] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-011f9fe4-8e6e-4fe6-8840-7e9c796e7995 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3018.295291] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3018.295291] nova-compute[62208]: warnings.warn( [ 3018.300744] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b145c53f-da98-4cdb-b2f0-2e03cb045fa2 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3018.304216] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3018.304216] nova-compute[62208]: warnings.warn( [ 3018.334496] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2deee2cf-f67c-4be8-8e03-23e5611976c0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3018.337279] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3018.337279] nova-compute[62208]: warnings.warn( [ 3018.343025] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f2894f1-d969-46ef-a91d-0e8994ca7c4b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3018.347166] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3018.347166] nova-compute[62208]: warnings.warn( [ 3018.357682] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3018.368035] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3018.385139] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 3018.385399] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3021.657201] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 3021.657201] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 3021.657807] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 
tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 3021.659996] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 3021.659996] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Copying Virtual Disk [datastore2] vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/f2f83562-11cc-4d69-9fcd-7cad7d9fa574/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 3021.660187] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e9d3cd4f-0c2b-44e4-a2f1-66d554a1e590 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3021.662513] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3021.662513] nova-compute[62208]: warnings.warn( [ 3021.671849] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 3021.671849] nova-compute[62208]: value = "task-38754" [ 3021.671849] nova-compute[62208]: _type = "Task" [ 3021.671849] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3021.675145] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3021.675145] nova-compute[62208]: warnings.warn( [ 3021.680511] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38754, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3022.175871] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.175871] nova-compute[62208]: warnings.warn( [ 3022.182110] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 3022.182578] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3022.183259] nova-compute[62208]: Faults: ['InvalidArgument'] [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Traceback (most recent call last): [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] yield resources [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] self.driver.spawn(context, instance, image_meta, [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] self._vmops.spawn(context, instance, image_meta, injected_files, [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] self._fetch_image_if_missing(context, vi) [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 3022.183259] nova-compute[62208]: ERROR 
nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] image_cache(vi, tmp_image_ds_loc) [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] vm_util.copy_virtual_disk( [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] session._wait_for_task(vmdk_copy_task) [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] return self.wait_for_task(task_ref) [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] return evt.wait() [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] result = hub.switch() [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] return self.greenlet.switch() [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] self.f(*self.args, **self.kw) [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] raise exceptions.translate_fault(task_info.error) [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Faults: ['InvalidArgument'] [ 3022.183259] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] [ 3022.184363] nova-compute[62208]: INFO nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 
tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Terminating instance [ 3022.185989] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3022.186196] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 3022.186441] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb494759-7c4c-46ad-a18e-e772ea8b4eee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.188704] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 3022.188926] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 3022.189674] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b726e36-28a4-48d7-82b8-80ee25faa34d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.192126] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.192126] nova-compute[62208]: warnings.warn( [ 3022.192464] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.192464] nova-compute[62208]: warnings.warn( [ 3022.196769] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 3022.197017] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2e55fbed-06d9-43a2-ab86-bb0868a4d3d1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.199811] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 3022.199988] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 3022.200567] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.200567] nova-compute[62208]: warnings.warn( [ 3022.200978] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-541764d2-6ac7-433a-9cb7-ca5431c78072 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.203504] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.203504] nova-compute[62208]: warnings.warn( [ 3022.207033] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 3022.207033] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]5255b917-6020-8991-c3ab-229407ffb636" [ 3022.207033] nova-compute[62208]: _type = "Task" [ 3022.207033] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3022.211543] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.211543] nova-compute[62208]: warnings.warn( [ 3022.223899] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 3022.224326] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating directory with path [datastore2] vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 3022.224662] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05c77007-7cc0-4deb-9477-d31d269db351 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.226748] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.226748] nova-compute[62208]: warnings.warn( [ 3022.248413] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Created directory with path [datastore2] vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 3022.248662] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Fetch image to [datastore2] vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 3022.248929] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 3022.249686] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1119e56f-5d34-461b-895a-c6ac2459fca1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.252317] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.252317] nova-compute[62208]: warnings.warn( [ 3022.257088] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ffa7a06-7d24-4550-a5c7-945cde4c8fb1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.259409] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.259409] nova-compute[62208]: warnings.warn( [ 3022.266618] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01fa3978-e464-4748-8741-e1019220e0ee {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.270140] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.270140] nova-compute[62208]: warnings.warn( [ 3022.303369] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51eda542-02c3-43d7-a533-4553e576af49 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.306013] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 3022.306225] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 3022.306399] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Deleting the datastore file [datastore2] 20e9ed05-3592-4e84-8806-0e30c7563b85 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 3022.306638] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-885799c4-7c7a-4aec-a5b4-459f21189186 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.308094] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.308094] nova-compute[62208]: warnings.warn( [ 3022.308445] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.308445] nova-compute[62208]: warnings.warn( [ 3022.313335] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b9051af9-b22c-4e05-bb65-782ad03adc62 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.315123] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Waiting for the task: (returnval){ [ 3022.315123] nova-compute[62208]: value = "task-38756" [ 3022.315123] nova-compute[62208]: _type = "Task" [ 3022.315123] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3022.315278] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.315278] nova-compute[62208]: warnings.warn( [ 3022.318291] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.318291] nova-compute[62208]: warnings.warn( [ 3022.323447] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38756, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3022.335937] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 3022.432291] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 3022.487330] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 3022.487537] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 3022.819633] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.819633] nova-compute[62208]: warnings.warn( [ 3022.826403] nova-compute[62208]: DEBUG oslo_vmware.api [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Task: {'id': task-38756, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068047} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3022.826762] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 3022.827043] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 3022.827343] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 3022.827683] nova-compute[62208]: INFO nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Took 0.64 seconds to destroy the instance on the hypervisor. 
[ 3022.830108] nova-compute[62208]: DEBUG nova.compute.claims [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Aborting claim: <nova.compute.claims.Claim object at 0x7fb9365edc30> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 3022.830287] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3022.830502] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3022.912153] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0b55a68-10fa-4c3d-9dc8-d8d2229b7033 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.915000] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.915000] nova-compute[62208]: warnings.warn( [ 3022.920198] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-323e3a67-6663-44f7-bf50-1e0700078695 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.923028] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.923028] nova-compute[62208]: warnings.warn( [ 3022.952369] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc10b241-aee9-4f38-acfa-00311c76307f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.954726] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.954726] nova-compute[62208]: warnings.warn( [ 3022.959922] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d8475c8-7bc9-4e4b-b738-9ff90a688bea {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3022.963573] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3022.963573] nova-compute[62208]: warnings.warn( [ 3022.973997] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3022.982633] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3022.997799] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.167s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3022.998303] nova-compute[62208]: Faults: ['InvalidArgument'] [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Traceback (most recent call last): [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] self.driver.spawn(context, instance, image_meta, [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] self._vmops.spawn(context, instance, image_meta, injected_files, [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] self._fetch_image_if_missing(context, vi) [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] image_cache(vi, tmp_image_ds_loc) [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] vm_util.copy_virtual_disk( [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] session._wait_for_task(vmdk_copy_task) [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] return self.wait_for_task(task_ref) [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] return evt.wait() [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] result = hub.switch() [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] return self.greenlet.switch() [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] self.f(*self.args, **self.kw) [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", 
line 448, in _poll_task [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] raise exceptions.translate_fault(task_info.error) [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Faults: ['InvalidArgument'] [ 3022.998303] nova-compute[62208]: ERROR nova.compute.manager [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] [ 3022.999801] nova-compute[62208]: DEBUG nova.compute.utils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 3023.000455] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Build of instance 20e9ed05-3592-4e84-8806-0e30c7563b85 was re-scheduled: A specified parameter was not correct: fileType [ 3023.000455] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 3023.000837] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 3023.001009] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 3023.001179] nova-compute[62208]: DEBUG nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 3023.001343] nova-compute[62208]: DEBUG nova.network.neutron [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 3023.264546] nova-compute[62208]: DEBUG nova.network.neutron [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3023.277670] nova-compute[62208]: INFO nova.compute.manager [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] [instance: 20e9ed05-3592-4e84-8806-0e30c7563b85] Took 0.28 seconds to deallocate network for instance. [ 3023.372210] nova-compute[62208]: INFO nova.scheduler.client.report [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Deleted allocations for instance 20e9ed05-3592-4e84-8806-0e30c7563b85 [ 3023.393342] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-03746be7-0351-4bf6-a2b8-d1f9be2571e9 tempest-AttachVolumeShelveTestJSON-216431597 tempest-AttachVolumeShelveTestJSON-216431597-project-member] Lock "20e9ed05-3592-4e84-8806-0e30c7563b85" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 193.129s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3028.606201] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_power_states {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3028.616570] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Getting list of instances from cluster (obj){ [ 3028.616570] nova-compute[62208]: value = "domain-c8" [ 3028.616570] nova-compute[62208]: _type = "ClusterComputeResource" [ 3028.616570] nova-compute[62208]: } {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 3028.618201] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-100b3eea-b10f-4da3-94dc-d65862809585 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3028.621865] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made 
to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3028.621865] nova-compute[62208]: warnings.warn( [ 3028.630198] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Got total of 1 instances {{(pid=62208) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 3028.630387] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Triggering sync for uuid 066392ef-3ddc-47a2-8397-813436e670c3 {{(pid=62208) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10342}} [ 3028.630735] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "066392ef-3ddc-47a2-8397-813436e670c3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3030.377074] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3030.377399] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3030.389038] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 3030.444606] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3030.444853] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3030.446375] nova-compute[62208]: INFO nova.compute.claims [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 3030.531587] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03322f9d-7d03-4a27-a3d5-8011f36ce6b7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3030.534307] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3030.534307] nova-compute[62208]: warnings.warn( [ 3030.539465] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d518ce2a-84ae-4630-a5ba-a544654d4bec {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3030.542427] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3030.542427] nova-compute[62208]: warnings.warn( [ 3030.571375] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d847fe77-2f4c-48dc-bdce-9c14ea8a8b98 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3030.573723] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3030.573723] nova-compute[62208]: warnings.warn( [ 3030.578820] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faf2da8a-4862-4faf-990e-187b826a7e0c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3030.582491] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3030.582491] nova-compute[62208]: warnings.warn( [ 3030.592981] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3030.601429] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3030.615657] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3030.616120] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 3030.651475] nova-compute[62208]: DEBUG nova.compute.utils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 3030.653021] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Allocating IP information in the background. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 3030.653184] nova-compute[62208]: DEBUG nova.network.neutron [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 3030.670069] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 3030.708342] nova-compute[62208]: DEBUG nova.policy [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f5b1c4d5f18f41d6abdba4aa0c25c4d8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a1485ca2a9104710b083712d1c2db0a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 3030.744988] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Start spawning the instance on the hypervisor. 
{{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 3030.767951] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 3030.768301] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 3030.768515] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 3030.768802] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 3030.769113] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 3030.769342] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 3030.769855] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 3030.770084] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
3030.770318] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 3030.770539] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 3030.770774] nova-compute[62208]: DEBUG nova.virt.hardware [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 3030.771677] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd4270f3-5206-45a9-8859-3bc6d0e7505f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3030.774315] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3030.774315] nova-compute[62208]: warnings.warn( [ 3030.780693] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60223d14-fab3-42a6-ac42-2e069335ea98 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3030.784651] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3030.784651] nova-compute[62208]: warnings.warn( [ 3031.019140] nova-compute[62208]: DEBUG nova.network.neutron [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Successfully created port: 4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 3031.707589] nova-compute[62208]: DEBUG nova.compute.manager [req-3cc62d48-4b53-48df-bb58-ce6e04d1407a req-9da06e99-af63-49a9-88d7-1222bd38001f service nova] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Received event network-vif-plugged-4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 3031.707895] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-3cc62d48-4b53-48df-bb58-ce6e04d1407a req-9da06e99-af63-49a9-88d7-1222bd38001f service nova] Acquiring lock "edf06ba6-b54c-4ecd-8dab-2f7757ec2e68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3031.708218] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-3cc62d48-4b53-48df-bb58-ce6e04d1407a req-9da06e99-af63-49a9-88d7-1222bd38001f service nova] Lock "edf06ba6-b54c-4ecd-8dab-2f7757ec2e68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3031.708218] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-3cc62d48-4b53-48df-bb58-ce6e04d1407a req-9da06e99-af63-49a9-88d7-1222bd38001f service nova] Lock "edf06ba6-b54c-4ecd-8dab-2f7757ec2e68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3031.708365] nova-compute[62208]: DEBUG nova.compute.manager [req-3cc62d48-4b53-48df-bb58-ce6e04d1407a req-9da06e99-af63-49a9-88d7-1222bd38001f service nova] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] No waiting events found dispatching network-vif-plugged-4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 3031.708540] nova-compute[62208]: WARNING nova.compute.manager [req-3cc62d48-4b53-48df-bb58-ce6e04d1407a req-9da06e99-af63-49a9-88d7-1222bd38001f service nova] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Received unexpected event network-vif-plugged-4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4 for instance with vm_state building and task_state spawning. 
[ 3031.782746] nova-compute[62208]: DEBUG nova.network.neutron [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Successfully updated port: 4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 3031.795926] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "refresh_cache-edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3031.796177] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquired lock "refresh_cache-edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3031.796392] nova-compute[62208]: DEBUG nova.network.neutron [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 3031.847758] nova-compute[62208]: DEBUG nova.network.neutron [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 3032.008997] nova-compute[62208]: DEBUG nova.network.neutron [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Updating instance_info_cache with network_info: [{"id": "4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4", "address": "fa:16:3e:46:2e:94", "network": {"id": "88eb8543-19a3-430b-8862-5d8f5e5d3f26", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2001032116-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a1485ca2a9104710b083712d1c2db0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39cd75b0-9ec7-48ed-b57f-34da0c573a60", "external-id": "nsx-vlan-transportzone-751", "segmentation_id": 751, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b3ea5c6-85", "ovs_interfaceid": "4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3032.024562] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Releasing lock "refresh_cache-edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3032.025076] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Instance network_info: |[{"id": "4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4", "address": "fa:16:3e:46:2e:94", "network": {"id": "88eb8543-19a3-430b-8862-5d8f5e5d3f26", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2001032116-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a1485ca2a9104710b083712d1c2db0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39cd75b0-9ec7-48ed-b57f-34da0c573a60", "external-id": "nsx-vlan-transportzone-751", "segmentation_id": 751, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b3ea5c6-85", "ovs_interfaceid": "4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 3032.025893] nova-compute[62208]: DEBUG 
nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:46:2e:94', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '39cd75b0-9ec7-48ed-b57f-34da0c573a60', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 3032.033902] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Creating folder: Project (a1485ca2a9104710b083712d1c2db0a3). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 3032.034631] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9a718ddd-9606-4060-9573-f9b68e042118 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3032.036912] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3032.036912] nova-compute[62208]: warnings.warn( [ 3032.046813] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Created folder: Project (a1485ca2a9104710b083712d1c2db0a3) in parent group-v17427. [ 3032.047060] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Creating folder: Instances. Parent ref: group-v17581. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 3032.047638] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f9c7ba02-43c1-46d2-a52a-593f37a86f2b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3032.049353] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3032.049353] nova-compute[62208]: warnings.warn( [ 3032.056434] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Created folder: Instances in parent group-v17581. [ 3032.057581] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 3032.057581] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 3032.057581] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c0ead62d-cffa-40b6-8a09-d32e8f24d0af {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3032.074281] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3032.074281] nova-compute[62208]: warnings.warn( [ 3032.080104] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 3032.080104] nova-compute[62208]: value = "task-38759" [ 3032.080104] nova-compute[62208]: _type = "Task" [ 3032.080104] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3032.083308] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3032.083308] nova-compute[62208]: warnings.warn( [ 3032.088712] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38759, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3032.584565] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3032.584565] nova-compute[62208]: warnings.warn( [ 3032.590588] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38759, 'name': CreateVM_Task, 'duration_secs': 0.30389} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3032.591391] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 3032.592063] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3032.592310] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3032.595082] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-422af4d0-67f4-4402-a6f6-02fe8228926c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3032.605504] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3032.605504] nova-compute[62208]: warnings.warn( [ 3032.623106] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 3032.623390] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-effd5508-7d6f-42dc-922f-ffd2dea935a3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3032.634059] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3032.634059] nova-compute[62208]: warnings.warn( [ 3032.639263] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3032.639263] nova-compute[62208]: value = "task-38760" [ 3032.639263] nova-compute[62208]: _type = "Task" [ 3032.639263] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3032.642928] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3032.642928] nova-compute[62208]: warnings.warn( [ 3032.647901] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38760, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3033.143137] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3033.143137] nova-compute[62208]: warnings.warn( [ 3033.149655] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38760, 'name': ReconfigVM_Task, 'duration_secs': 0.103912} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3033.149948] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 3033.150173] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.558s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3033.150423] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3033.150574] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3033.150893] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 
tempest-AttachVolumeTestJSON-1594482043-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 3033.151148] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-21369d45-58fa-4373-a694-0d4f4012f2b6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3033.152676] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3033.152676] nova-compute[62208]: warnings.warn( [ 3033.155734] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3033.155734] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b4a180-18fc-6d4a-9419-b48a9a838da0" [ 3033.155734] nova-compute[62208]: _type = "Task" [ 3033.155734] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3033.158630] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3033.158630] nova-compute[62208]: warnings.warn( [ 3033.163423] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52b4a180-18fc-6d4a-9419-b48a9a838da0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3033.659708] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3033.659708] nova-compute[62208]: warnings.warn( [ 3033.665807] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3033.666052] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 3033.666257] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3033.733095] nova-compute[62208]: DEBUG nova.compute.manager [req-aee8a0ea-245c-49b7-9d82-e3923e3ffc67 req-02d239f0-fc2b-4f91-9eb7-cdbf0833d889 service nova] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Received event network-changed-4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 3033.733335] nova-compute[62208]: DEBUG nova.compute.manager [req-aee8a0ea-245c-49b7-9d82-e3923e3ffc67 req-02d239f0-fc2b-4f91-9eb7-cdbf0833d889 service nova] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Refreshing instance network info cache due to event network-changed-4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4. {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 3033.733547] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-aee8a0ea-245c-49b7-9d82-e3923e3ffc67 req-02d239f0-fc2b-4f91-9eb7-cdbf0833d889 service nova] Acquiring lock "refresh_cache-edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3033.733730] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-aee8a0ea-245c-49b7-9d82-e3923e3ffc67 req-02d239f0-fc2b-4f91-9eb7-cdbf0833d889 service nova] Acquired lock "refresh_cache-edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3033.733896] nova-compute[62208]: DEBUG nova.network.neutron [req-aee8a0ea-245c-49b7-9d82-e3923e3ffc67 req-02d239f0-fc2b-4f91-9eb7-cdbf0833d889 service nova] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Refreshing network info cache for port 4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 3033.951361] nova-compute[62208]: DEBUG nova.network.neutron [req-aee8a0ea-245c-49b7-9d82-e3923e3ffc67 req-02d239f0-fc2b-4f91-9eb7-cdbf0833d889 service nova] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Updated VIF entry in instance network info cache for port 4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4. 
{{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 3033.951702] nova-compute[62208]: DEBUG nova.network.neutron [req-aee8a0ea-245c-49b7-9d82-e3923e3ffc67 req-02d239f0-fc2b-4f91-9eb7-cdbf0833d889 service nova] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Updating instance_info_cache with network_info: [{"id": "4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4", "address": "fa:16:3e:46:2e:94", "network": {"id": "88eb8543-19a3-430b-8862-5d8f5e5d3f26", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2001032116-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a1485ca2a9104710b083712d1c2db0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39cd75b0-9ec7-48ed-b57f-34da0c573a60", "external-id": "nsx-vlan-transportzone-751", "segmentation_id": 751, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b3ea5c6-85", "ovs_interfaceid": "4b3ea5c6-85a5-4d1f-9ef1-dc3d9d5077e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3033.961683] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-aee8a0ea-245c-49b7-9d82-e3923e3ffc67 req-02d239f0-fc2b-4f91-9eb7-cdbf0833d889 service nova] Releasing lock "refresh_cache-edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3064.167274] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3064.167607] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 3064.167607] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 3064.179098] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 3064.179348] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 3064.179502] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 3064.180036] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3065.141164] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3069.136335] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3071.140655] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3071.143455] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 3071.143455] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 3071.144100] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 3071.145906] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None 
req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 3071.146181] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Copying Virtual Disk [datastore2] vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/71e18a7a-76b1-4eee-8e7e-ac890234d146/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 3071.146467] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f6e3cfad-84ec-42cc-b87f-226ee9b4c829 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3071.149003] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.149003] nova-compute[62208]: warnings.warn( [ 3071.156252] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 3071.156252] nova-compute[62208]: value = "task-38761" [ 3071.156252] nova-compute[62208]: _type = "Task" [ 3071.156252] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3071.159254] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.159254] nova-compute[62208]: warnings.warn( [ 3071.164243] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38761, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3071.661590] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.661590] nova-compute[62208]: warnings.warn( [ 3071.667361] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Fault InvalidArgument not matched. 
{{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 3071.667760] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3071.668424] nova-compute[62208]: Faults: ['InvalidArgument'] [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Traceback (most recent call last): [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] yield resources [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] self.driver.spawn(context, instance, image_meta, [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] self._fetch_image_if_missing(context, vi) [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] image_cache(vi, tmp_image_ds_loc) [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] vm_util.copy_virtual_disk( [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] 
session._wait_for_task(vmdk_copy_task) [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] return self.wait_for_task(task_ref) [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] return evt.wait() [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] result = hub.switch() [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] return self.greenlet.switch() [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] self.f(*self.args, **self.kw) [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] raise exceptions.translate_fault(task_info.error) [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Faults: ['InvalidArgument'] [ 3071.668424] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] [ 3071.670642] nova-compute[62208]: INFO nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Terminating instance [ 3071.670974] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3071.671241] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 
tempest-AttachVolumeTestJSON-1594482043-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 3071.671530] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-238812eb-febf-4c86-81b6-23e264dda97d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3071.673707] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 3071.673965] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 3071.674719] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52493ce9-ffd2-48dd-932e-a845d0c3a909 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3071.677209] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.677209] nova-compute[62208]: warnings.warn( [ 3071.677611] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.677611] nova-compute[62208]: warnings.warn( [ 3071.681937] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 3071.682204] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8c0362d7-ecdc-4789-b41a-75470a6966f9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3071.684438] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 3071.684670] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 3071.685272] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.685272] nova-compute[62208]: warnings.warn( [ 3071.685751] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0656e706-2f57-4efc-a910-7d7e59ec6146 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3071.687565] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.687565] nova-compute[62208]: warnings.warn( [ 3071.690622] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3071.690622] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525c4ad1-0c91-73b6-0388-25459c1a5759" [ 3071.690622] nova-compute[62208]: _type = "Task" [ 3071.690622] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3071.693756] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.693756] nova-compute[62208]: warnings.warn( [ 3071.698165] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525c4ad1-0c91-73b6-0388-25459c1a5759, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3071.750025] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 3071.750421] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 3071.750724] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleting the datastore file [datastore2] 066392ef-3ddc-47a2-8397-813436e670c3 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 3071.751022] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fd30f508-b951-4353-8b71-de2fd6ca1e21 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3071.752963] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.752963] nova-compute[62208]: warnings.warn( [ 3071.757704] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Waiting for the task: (returnval){ [ 3071.757704] nova-compute[62208]: value = "task-38763" [ 3071.757704] nova-compute[62208]: _type = "Task" [ 3071.757704] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3071.760754] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3071.760754] nova-compute[62208]: warnings.warn( [ 3071.765490] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38763, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3072.141533] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3072.141871] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3072.141946] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 3072.195151] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.195151] nova-compute[62208]: warnings.warn( [ 3072.201378] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 3072.201643] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Creating directory with path [datastore2] vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 3072.201883] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-21fb70b0-a8aa-4634-8d14-20e75f640ce9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.204726] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.204726] nova-compute[62208]: warnings.warn( [ 3072.214562] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Created directory with path [datastore2] vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 3072.214769] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Fetch image to [datastore2] vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 3072.214939] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 3072.215706] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcc124ca-b255-4f34-b4c2-be4a092e4f9d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.218128] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.218128] nova-compute[62208]: warnings.warn( [ 3072.222786] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0823e1fa-50e1-44fa-912d-574a87f559b6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.224987] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.224987] nova-compute[62208]: warnings.warn( [ 3072.231709] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15c33c9c-4dc5-476b-bea6-524f6f2ba284 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.235184] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.235184] nova-compute[62208]: warnings.warn( [ 3072.264547] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca35dff5-08f9-4fe9-b309-c1d911dfd29d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.266621] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.266621] nova-compute[62208]: warnings.warn( [ 3072.266959] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.266959] nova-compute[62208]: warnings.warn( [ 3072.271556] nova-compute[62208]: DEBUG oslo_vmware.api [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Task: {'id': task-38763, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069093} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3072.273019] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 3072.273218] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 3072.273390] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 3072.273569] nova-compute[62208]: INFO nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 3072.275333] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8230fc9e-3b42-4322-9fec-725e97b7f808 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.277177] nova-compute[62208]: DEBUG nova.compute.claims [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Aborting claim: <nova.compute.claims.Claim object at 0x7fb936105450> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 3072.277352] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3072.277562] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3072.279928] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.279928] nova-compute[62208]: warnings.warn( [ 3072.300334] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 3072.352256] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 3072.407432] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b31e2cf-0e03-45a7-a6ee-4e2ac9e1b376 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.411116] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.411116] nova-compute[62208]: warnings.warn( [ 3072.412304] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 3072.412490] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 3072.416303] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-242be751-86eb-4887-9536-170e371baf96 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.419620] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.419620] nova-compute[62208]: warnings.warn( [ 3072.448955] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18ec99f4-0c98-4777-855a-08800b5cdc3f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.451478] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3072.451478] nova-compute[62208]: warnings.warn( [ 3072.456653] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10dc41ef-333c-4cf6-a73b-c43347ea4245 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3072.460318] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 3072.460318] nova-compute[62208]: warnings.warn(
[ 3072.469990] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 3072.478517] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 3072.493816] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.216s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 3072.494348] nova-compute[62208]: Faults: ['InvalidArgument']
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Traceback (most recent call last):
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] self.driver.spawn(context, instance, image_meta,
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] self._fetch_image_if_missing(context, vi)
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] image_cache(vi, tmp_image_ds_loc)
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] vm_util.copy_virtual_disk(
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] session._wait_for_task(vmdk_copy_task)
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] return self.wait_for_task(task_ref)
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] return evt.wait()
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] result = hub.switch()
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] return self.greenlet.switch()
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] self.f(*self.args, **self.kw)
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] raise exceptions.translate_fault(task_info.error)
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Faults: ['InvalidArgument']
[ 3072.494348] nova-compute[62208]: ERROR nova.compute.manager [instance: 066392ef-3ddc-47a2-8397-813436e670c3]
[ 3072.495046] nova-compute[62208]: DEBUG nova.compute.utils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f
tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 3072.496459] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Build of instance 066392ef-3ddc-47a2-8397-813436e670c3 was re-scheduled: A specified parameter was not correct: fileType [ 3072.496459] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 3072.496892] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 3072.497093] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 3072.497276] nova-compute[62208]: DEBUG nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 3072.497441] nova-compute[62208]: DEBUG nova.network.neutron [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 3072.755723] nova-compute[62208]: DEBUG nova.network.neutron [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3072.768671] nova-compute[62208]: INFO nova.compute.manager [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] Took 0.27 seconds to deallocate network for instance. 
[ 3072.862065] nova-compute[62208]: INFO nova.scheduler.client.report [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Deleted allocations for instance 066392ef-3ddc-47a2-8397-813436e670c3 [ 3072.895855] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-8f3354fa-3b9e-4789-a1a5-9631c69eff8f tempest-DeleteServersTestJSON-2066579215 tempest-DeleteServersTestJSON-2066579215-project-member] Lock "066392ef-3ddc-47a2-8397-813436e670c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 97.368s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3072.895855] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "066392ef-3ddc-47a2-8397-813436e670c3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 44.264s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3072.895855] nova-compute[62208]: INFO nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 066392ef-3ddc-47a2-8397-813436e670c3] During sync_power_state the instance has a pending task (spawning). Skip. [ 3072.895855] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "066392ef-3ddc-47a2-8397-813436e670c3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3073.142546] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3075.849874] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquiring lock "313c553e-5e99-47b7-9ab3-7517df11aa15" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3075.850261] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Lock "313c553e-5e99-47b7-9ab3-7517df11aa15" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3075.864052] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Starting instance... 
{{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 3075.917078] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3075.917338] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3075.919057] nova-compute[62208]: INFO nova.compute.claims [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 3076.005758] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b6e4cf7-6ec5-4a9f-9a40-edbde1181342 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.008349] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.008349] nova-compute[62208]: warnings.warn( [ 3076.014445] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e7cf308-e4ce-44bd-be3c-a399c32d73da {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.018775] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.018775] nova-compute[62208]: warnings.warn( [ 3076.045998] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a12ee0b7-9d65-446f-8614-3562810dfa29 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.048406] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.048406] nova-compute[62208]: warnings.warn( [ 3076.054050] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-171ff434-3607-49c5-b7e0-8fb699275345 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.057807] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.057807] nova-compute[62208]: warnings.warn( [ 3076.067774] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3076.079063] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3076.099984] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3076.100518] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 3076.136863] nova-compute[62208]: DEBUG nova.compute.utils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 3076.138150] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Not allocating networking since 'none' was specified. 
{{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1968}} [ 3076.149764] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Start building block device mappings for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 3076.218164] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 3076.240046] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 3076.240318] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 3076.240474] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 3076.240659] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 3076.240810] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 3076.240998] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 3076.241224] nova-compute[62208]: 
DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 3076.241386] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 3076.241555] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 3076.241722] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 3076.241897] nova-compute[62208]: DEBUG nova.virt.hardware [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 3076.242780] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1db3d8ed-33b7-4e86-9856-59782ea24a20 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.245613] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.245613] nova-compute[62208]: warnings.warn( [ 3076.252395] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddc1cca0-b955-4423-86a3-7b6fbf44ca6f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.258176] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.258176] nova-compute[62208]: warnings.warn( [ 3076.271433] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Instance VIF info [] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 3076.276975] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Creating folder: Project (69f250761ea64f07bc31c549493f55b5). Parent ref: group-v17427. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 3076.277275] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-70a42c0b-22ca-4d9b-a9c7-1792cbea1a2b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.278874] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.278874] nova-compute[62208]: warnings.warn( [ 3076.287600] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Created folder: Project (69f250761ea64f07bc31c549493f55b5) in parent group-v17427. [ 3076.287784] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Creating folder: Instances. Parent ref: group-v17584. {{(pid=62208) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 3076.288020] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0c92f891-b21b-41da-a030-6618db80d0d4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.289527] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.289527] nova-compute[62208]: warnings.warn( [ 3076.296758] nova-compute[62208]: INFO nova.virt.vmwareapi.vm_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Created folder: Instances in parent group-v17584. [ 3076.296959] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 3076.297147] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 3076.297346] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7efe51a4-39c7-4299-b589-81ff5c7daede {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.310179] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.310179] nova-compute[62208]: warnings.warn( [ 3076.315181] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 3076.315181] nova-compute[62208]: value = "task-38766" [ 3076.315181] nova-compute[62208]: _type = "Task" [ 3076.315181] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3076.318002] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.318002] nova-compute[62208]: warnings.warn( [ 3076.322678] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38766, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3076.819539] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.819539] nova-compute[62208]: warnings.warn( [ 3076.825635] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38766, 'name': CreateVM_Task, 'duration_secs': 0.257443} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3076.826001] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 3076.826464] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3076.826950] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3076.829912] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10d5cf1c-1630-41d9-bc1c-6243c820cae0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.840073] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.840073] nova-compute[62208]: warnings.warn( [ 3076.857813] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Reconfiguring VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 3076.858558] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b23e363f-0735-42cc-a837-0ff6756fe15a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3076.868758] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.868758] nova-compute[62208]: warnings.warn( [ 3076.874618] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Waiting for the task: (returnval){ [ 3076.874618] nova-compute[62208]: value = "task-38767" [ 3076.874618] nova-compute[62208]: _type = "Task" [ 3076.874618] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3076.878059] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3076.878059] nova-compute[62208]: warnings.warn( [ 3076.884420] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Task: {'id': task-38767, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3077.379347] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3077.379347] nova-compute[62208]: warnings.warn( [ 3077.385719] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Task: {'id': task-38767, 'name': ReconfigVM_Task, 'duration_secs': 0.104457} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3077.386301] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Reconfigured VM instance to enable vnc on port - 5901 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 3077.386694] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.560s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3077.387091] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3077.387362] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3077.387795] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquired external semaphore 
"[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 3077.388201] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c0f5c18e-3b26-462e-9390-75bf2ca1cbef {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3077.390006] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3077.390006] nova-compute[62208]: warnings.warn( [ 3077.393568] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Waiting for the task: (returnval){ [ 3077.393568] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52fae21c-be1e-0519-35c5-e1a3e64c82fe" [ 3077.393568] nova-compute[62208]: _type = "Task" [ 3077.393568] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3077.397069] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3077.397069] nova-compute[62208]: warnings.warn( [ 3077.402734] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52fae21c-be1e-0519-35c5-e1a3e64c82fe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3077.897924] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3077.897924] nova-compute[62208]: warnings.warn( [ 3077.905383] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3077.905898] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 3077.906241] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3078.141305] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3078.152125] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3078.152586] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3078.152912] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3078.153197] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 3078.154451] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef81164f-f64c-4f4a-aaa3-3b48b14b698c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3078.157384] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to 
host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3078.157384] nova-compute[62208]: warnings.warn( [ 3078.163926] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72e84733-47f1-4390-b510-e9d23885969e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3078.167811] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3078.167811] nova-compute[62208]: warnings.warn( [ 3078.178750] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fc1c9a7-3c22-40e7-a1ba-869805be4b13 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3078.181396] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3078.181396] nova-compute[62208]: warnings.warn( [ 3078.186035] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecf9d2ee-d092-4d77-8427-2bcf656bcd85 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3078.189226] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3078.189226] nova-compute[62208]: warnings.warn( [ 3078.216852] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181951MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 3078.217261] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3078.217647] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3078.261080] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance edf06ba6-b54c-4ecd-8dab-2f7757ec2e68 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 3078.261245] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 313c553e-5e99-47b7-9ab3-7517df11aa15 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 3078.261428] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 3078.261579] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 3078.300535] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ac9beb9-0dcc-412c-ba15-3cba48041fe9 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3078.303083] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3078.303083] nova-compute[62208]: warnings.warn( [ 3078.308501] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b333019e-5a6c-4a21-b3ef-30d8b1477cfa {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3078.311574] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3078.311574] nova-compute[62208]: warnings.warn( [ 3078.337747] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-177facf2-e7da-4e66-85a6-808062efabf0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3078.340182] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3078.340182] nova-compute[62208]: warnings.warn( [ 3078.345250] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5508b402-d62b-4c3b-b5f3-0b1020545f19 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3078.348952] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3078.348952] nova-compute[62208]: warnings.warn( [ 3078.358608] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3078.369046] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3078.384255] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 3078.384507] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3086.380260] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3120.350420] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 3120.350420] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 3120.350420] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 3120.351002] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 3120.352750] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 3120.353010] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Copying Virtual Disk [datastore2] vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/1f46a32d-29cd-442c-9da5-b58f56310cd3/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 3120.353304] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a7953275-38fe-42f0-91ba-ab64f5546a77 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3120.355728] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.355728] nova-compute[62208]: warnings.warn( [ 3120.362249] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3120.362249] nova-compute[62208]: value = "task-38768" [ 3120.362249] nova-compute[62208]: _type = "Task" [ 3120.362249] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3120.365332] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.365332] nova-compute[62208]: warnings.warn( [ 3120.370681] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38768, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3120.868444] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.868444] nova-compute[62208]: warnings.warn( [ 3120.874652] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 3120.874947] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3120.875507] nova-compute[62208]: Faults: ['InvalidArgument'] [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Traceback (most recent call last): [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] yield resources [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] self.driver.spawn(context, instance, image_meta, [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] self._vmops.spawn(context, instance, image_meta, injected_files, [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] self._fetch_image_if_missing(context, vi) [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: 
edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] image_cache(vi, tmp_image_ds_loc) [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] vm_util.copy_virtual_disk( [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] session._wait_for_task(vmdk_copy_task) [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] return self.wait_for_task(task_ref) [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] return evt.wait() [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] result = hub.switch() [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] return self.greenlet.switch() [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] self.f(*self.args, **self.kw) [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] raise exceptions.translate_fault(task_info.error) [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Faults: ['InvalidArgument'] [ 3120.875507] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] [ 3120.876311] nova-compute[62208]: INFO nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: 
edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Terminating instance [ 3120.878043] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3120.878043] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 3120.878043] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9c4b2cf8-a7c5-4503-9601-5f0b05179558 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3120.880194] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 3120.880383] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 3120.881192] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b96748e8-09f9-4ce9-a305-dd689d4e6e1e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3120.883852] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.883852] nova-compute[62208]: warnings.warn( [ 3120.884236] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.884236] nova-compute[62208]: warnings.warn( [ 3120.888957] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 3120.889281] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-545d759d-b196-49e4-9cb3-84f3504346f7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3120.891738] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 3120.891916] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 3120.892512] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.892512] nova-compute[62208]: warnings.warn( [ 3120.892934] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b1383c37-c4be-4307-9ce5-b384749bfcbe {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3120.895018] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.895018] nova-compute[62208]: warnings.warn( [ 3120.898266] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Waiting for the task: (returnval){ [ 3120.898266] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52ee1ef2-1cd1-c6e2-0f77-e7865205dab8" [ 3120.898266] nova-compute[62208]: _type = "Task" [ 3120.898266] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3120.901471] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.901471] nova-compute[62208]: warnings.warn( [ 3120.908400] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]52ee1ef2-1cd1-c6e2-0f77-e7865205dab8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3120.964676] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 3120.964951] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 3120.965156] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Deleting the datastore file [datastore2] edf06ba6-b54c-4ecd-8dab-2f7757ec2e68 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 3120.965429] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-71e2d1de-f6ae-474d-b944-0e87992dbde1 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3120.967679] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.967679] nova-compute[62208]: warnings.warn( [ 3120.972476] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3120.972476] nova-compute[62208]: value = "task-38770" [ 3120.972476] nova-compute[62208]: _type = "Task" [ 3120.972476] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3120.975877] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3120.975877] nova-compute[62208]: warnings.warn( [ 3120.984087] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38770, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3121.402031] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.402031] nova-compute[62208]: warnings.warn( [ 3121.408349] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 3121.408640] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Creating directory with path [datastore2] vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 3121.408875] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0b204583-21f4-4500-9c67-d14b765b70a8 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.410820] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.410820] nova-compute[62208]: warnings.warn( [ 3121.420829] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Created directory with path [datastore2] vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 3121.421032] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Fetch image to [datastore2] vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 3121.421203] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 3121.421963] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f63545a-28f3-4389-bcc1-81031c89c698 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.424285] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.424285] nova-compute[62208]: warnings.warn( [ 3121.428777] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f74a74a-bef2-439a-ad91-0667d0a03937 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.430955] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.430955] nova-compute[62208]: warnings.warn( [ 3121.437754] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d22266ab-b88c-4f69-8b65-16b85a6e82db {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.441342] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.441342] nova-compute[62208]: warnings.warn( [ 3121.469117] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0cc42fd-dd11-4189-aa69-5ff0f1808b17 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.471572] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.471572] nova-compute[62208]: warnings.warn( [ 3121.477562] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.477562] nova-compute[62208]: warnings.warn( [ 3121.478066] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e7e756ae-e119-44ca-a0d7-9e48ef428185 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.479614] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.479614] nova-compute[62208]: warnings.warn( [ 3121.482690] nova-compute[62208]: DEBUG oslo_vmware.api [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38770, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08351} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3121.482970] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 3121.483165] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 3121.483337] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 3121.483510] nova-compute[62208]: INFO nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Took 0.60 seconds to destroy the instance on the hypervisor. [ 3121.485645] nova-compute[62208]: DEBUG nova.compute.claims [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Aborting claim: <nova.compute.claims.Claim object at 0x7fb93583c790> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 3121.485817] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3121.486033] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3121.501018] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 3121.548273] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = 
https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 3121.604422] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 3121.604607] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 3121.614358] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57102316-513c-46de-8cca-6582e5678b5c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.616898] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.616898] nova-compute[62208]: warnings.warn( [ 3121.622222] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c971a317-1077-4c8f-8d9b-bc035f60352f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.625523] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.625523] nova-compute[62208]: warnings.warn( [ 3121.652250] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-149a1eae-61cf-4c1a-8d0f-86eb94ab5157 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.654626] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.654626] nova-compute[62208]: warnings.warn( [ 3121.660158] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4ef55df-60c9-4e09-ad64-939b3d706b01 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3121.663951] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3121.663951] nova-compute[62208]: warnings.warn( [ 3121.673747] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3121.682851] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3121.699718] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.213s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3121.700303] nova-compute[62208]: Faults: ['InvalidArgument'] [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Traceback (most recent call last): [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] self.driver.spawn(context, instance, image_meta, [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 3121.700303] 
nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] self._vmops.spawn(context, instance, image_meta, injected_files, [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] self._fetch_image_if_missing(context, vi) [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] image_cache(vi, tmp_image_ds_loc) [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] vm_util.copy_virtual_disk( [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] session._wait_for_task(vmdk_copy_task) [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] return self.wait_for_task(task_ref) [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] return evt.wait() [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] result = hub.switch() [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] return self.greenlet.switch() [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] self.f(*self.args, **self.kw) [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 3121.700303] nova-compute[62208]: ERROR 
nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] raise exceptions.translate_fault(task_info.error) [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Faults: ['InvalidArgument'] [ 3121.700303] nova-compute[62208]: ERROR nova.compute.manager [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] [ 3121.701281] nova-compute[62208]: DEBUG nova.compute.utils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 3121.702650] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Build of instance edf06ba6-b54c-4ecd-8dab-2f7757ec2e68 was re-scheduled: A specified parameter was not correct: fileType [ 3121.702650] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 3121.703043] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 3121.703215] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 3121.703385] nova-compute[62208]: DEBUG nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 3121.703546] nova-compute[62208]: DEBUG nova.network.neutron [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 3121.944014] nova-compute[62208]: DEBUG nova.network.neutron [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3121.956945] nova-compute[62208]: INFO nova.compute.manager [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: edf06ba6-b54c-4ecd-8dab-2f7757ec2e68] Took 0.25 seconds to deallocate network for instance. [ 3122.064937] nova-compute[62208]: INFO nova.scheduler.client.report [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Deleted allocations for instance edf06ba6-b54c-4ecd-8dab-2f7757ec2e68 [ 3122.085642] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-48784b7a-7ad2-4557-9086-07163126bb55 tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "edf06ba6-b54c-4ecd-8dab-2f7757ec2e68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 91.708s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3123.949780] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3123.950260] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3123.961488] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Starting 
instance... {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2423}} [ 3124.015952] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3124.016229] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3124.018287] nova-compute[62208]: INFO nova.compute.claims [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 3124.105135] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa66a40e-2b81-487c-bebe-ebfa7499e005 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3124.107759] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3124.107759] nova-compute[62208]: warnings.warn( [ 3124.113848] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c14855d4-2bf2-44a1-a9d6-b433b5e0b81e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3124.117001] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3124.117001] nova-compute[62208]: warnings.warn( [ 3124.145406] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3124.145564] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 3124.145662] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 3124.147586] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f117872-2c10-48c5-8ae0-7f73b088608b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3124.150225] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3124.150225] nova-compute[62208]: warnings.warn( [ 3124.155388] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f40bc154-239b-412b-bdcb-7393992f4bce {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3124.160564] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 3124.160741] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Skipping network cache update for instance because it is Building. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 3124.160875] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 3124.161002] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3124.161002] nova-compute[62208]: warnings.warn( [ 3124.170749] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3124.178935] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3124.194620] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3124.195085] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Start building networks asynchronously for instance. {{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2820}} [ 3124.228478] nova-compute[62208]: DEBUG nova.compute.utils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Using /dev/sd instead of None {{(pid=62208) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 3124.229794] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Allocating IP information in the background. {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 3124.229967] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] allocate_for_instance() {{(pid=62208) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 3124.241213] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Start building block device mappings for instance. 
{{(pid=62208) _build_resources /opt/stack/nova/nova/compute/manager.py:2855}} [ 3124.279078] nova-compute[62208]: DEBUG nova.policy [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f5b1c4d5f18f41d6abdba4aa0c25c4d8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a1485ca2a9104710b083712d1c2db0a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62208) authorize /opt/stack/nova/nova/policy.py:203}} [ 3124.313456] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Start spawning the instance on the hypervisor. {{(pid=62208) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2629}} [ 3124.333947] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-02-06T09:04:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9cb04cab48cfe256ad81b0c491c612c4',container_format='bare',created_at=2024-02-06T09:03:59Z,direct_url=<?>,disk_format='vmdk',id=77df2b34-a7d7-43a1-a59a-01f7474c0cf7,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-ide',owner='9c0f4e083006475493ed3f4bd4d93da7',properties=ImageMetaProps,protected=<?>,size=50659328,status='active',tags=<?>,updated_at=2024-02-06T09:04:00Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 3124.334283] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Flavor limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 3124.334464] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Image limits 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 3124.335317] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Flavor pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 3124.335483] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Image pref 0:0:0 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 
3124.335641] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62208) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 3124.335866] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 3124.336097] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 3124.336282] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Got 1 possible topologies {{(pid=62208) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 3124.336453] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 3124.336637] nova-compute[62208]: DEBUG nova.virt.hardware [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62208) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 3124.337703] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6358f0fc-61d9-4c80-9192-c2422eb1a679 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3124.340166] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3124.340166] nova-compute[62208]: warnings.warn( [ 3124.345888] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5528edcc-5670-4054-88f2-beaf90998d81 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3124.351258] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3124.351258] nova-compute[62208]: warnings.warn( [ 3124.564432] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Successfully created port: 755506d4-ed0f-4753-b37c-9cd11ff94386 {{(pid=62208) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 3125.031865] nova-compute[62208]: DEBUG nova.compute.manager [req-18cbba4c-0d13-4ca1-89f8-c7252b800c1d req-b6e56d54-bd09-4ef1-9db5-ac50d90eff5e service nova] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Received event network-vif-plugged-755506d4-ed0f-4753-b37c-9cd11ff94386 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 3125.032215] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-18cbba4c-0d13-4ca1-89f8-c7252b800c1d req-b6e56d54-bd09-4ef1-9db5-ac50d90eff5e service nova] Acquiring lock "cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3125.032415] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-18cbba4c-0d13-4ca1-89f8-c7252b800c1d req-b6e56d54-bd09-4ef1-9db5-ac50d90eff5e service nova] Lock "cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3125.032586] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-18cbba4c-0d13-4ca1-89f8-c7252b800c1d req-b6e56d54-bd09-4ef1-9db5-ac50d90eff5e service nova] Lock "cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3125.032820] nova-compute[62208]: DEBUG nova.compute.manager [req-18cbba4c-0d13-4ca1-89f8-c7252b800c1d req-b6e56d54-bd09-4ef1-9db5-ac50d90eff5e service nova] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] No waiting events found dispatching network-vif-plugged-755506d4-ed0f-4753-b37c-9cd11ff94386 {{(pid=62208) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 3125.032992] nova-compute[62208]: WARNING nova.compute.manager [req-18cbba4c-0d13-4ca1-89f8-c7252b800c1d req-b6e56d54-bd09-4ef1-9db5-ac50d90eff5e service nova] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Received unexpected event network-vif-plugged-755506d4-ed0f-4753-b37c-9cd11ff94386 for instance with vm_state building and task_state spawning. 
[ 3125.104250] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Successfully updated port: 755506d4-ed0f-4753-b37c-9cd11ff94386 {{(pid=62208) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 3125.115280] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "refresh_cache-cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3125.115520] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquired lock "refresh_cache-cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3125.115756] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 3125.152746] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 3125.311561] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Updating instance_info_cache with network_info: [{"id": "755506d4-ed0f-4753-b37c-9cd11ff94386", "address": "fa:16:3e:82:83:ce", "network": {"id": "88eb8543-19a3-430b-8862-5d8f5e5d3f26", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2001032116-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a1485ca2a9104710b083712d1c2db0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39cd75b0-9ec7-48ed-b57f-34da0c573a60", "external-id": "nsx-vlan-transportzone-751", "segmentation_id": 751, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap755506d4-ed", "ovs_interfaceid": "755506d4-ed0f-4753-b37c-9cd11ff94386", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3125.324944] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Releasing lock "refresh_cache-cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3125.325252] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Instance network_info: |[{"id": "755506d4-ed0f-4753-b37c-9cd11ff94386", "address": "fa:16:3e:82:83:ce", "network": {"id": "88eb8543-19a3-430b-8862-5d8f5e5d3f26", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2001032116-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a1485ca2a9104710b083712d1c2db0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39cd75b0-9ec7-48ed-b57f-34da0c573a60", "external-id": "nsx-vlan-transportzone-751", "segmentation_id": 751, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap755506d4-ed", "ovs_interfaceid": "755506d4-ed0f-4753-b37c-9cd11ff94386", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62208) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1987}} [ 3125.325667] nova-compute[62208]: DEBUG 
nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:82:83:ce', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '39cd75b0-9ec7-48ed-b57f-34da0c573a60', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '755506d4-ed0f-4753-b37c-9cd11ff94386', 'vif_model': 'e1000'}] {{(pid=62208) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 3125.336000] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 3125.336620] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Creating VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 3125.336861] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b6578db5-0607-409c-9f8d-5d234e7b9a12 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3125.351610] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3125.351610] nova-compute[62208]: warnings.warn( [ 3125.358105] nova-compute[62208]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 3125.358105] nova-compute[62208]: value = "task-38771" [ 3125.358105] nova-compute[62208]: _type = "Task" [ 3125.358105] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3125.362986] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3125.362986] nova-compute[62208]: warnings.warn( [ 3125.368485] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38771, 'name': CreateVM_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3125.863612] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3125.863612] nova-compute[62208]: warnings.warn( [ 3125.869667] nova-compute[62208]: DEBUG oslo_vmware.api [-] Task: {'id': task-38771, 'name': CreateVM_Task, 'duration_secs': 0.293272} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3125.869843] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Created VM on the ESX host {{(pid=62208) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 3125.870435] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "vmware.get_and_set_vnc_port" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3125.870666] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "vmware.get_and_set_vnc_port" acquired by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3125.873883] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad5235ab-5585-4d20-b4d4-e624a0b4c91e {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3125.883347] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3125.883347] nova-compute[62208]: warnings.warn( [ 3125.901705] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Reconfiguring VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1815}} [ 3125.902075] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-2831d3c0-ca91-4549-967d-349e5977cfa3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3125.912344] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3125.912344] nova-compute[62208]: warnings.warn( [ 3125.918927] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3125.918927] nova-compute[62208]: value = "task-38772" [ 3125.918927] nova-compute[62208]: _type = "Task" [ 3125.918927] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3125.922189] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3125.922189] nova-compute[62208]: warnings.warn( [ 3125.927959] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38772, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3126.140607] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3126.423299] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3126.423299] nova-compute[62208]: warnings.warn( [ 3126.429484] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38772, 'name': ReconfigVM_Task, 'duration_secs': 0.107258} completed successfully. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3126.429756] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Reconfigured VM instance to enable vnc on port - 5900 {{(pid=62208) _get_and_set_vnc_config /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1819}} [ 3126.429990] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "vmware.get_and_set_vnc_port" "released" by "nova.virt.vmwareapi.vmops.VMwareVMOps._get_and_set_vnc_config" :: held 0.559s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3126.430278] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3126.430453] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquired lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3126.430778] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 3126.431049] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f77ab635-04c8-408f-9195-ca2fa312fa3f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3126.432619] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3126.432619] nova-compute[62208]: warnings.warn( [ 3126.435827] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3126.435827] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521c0aef-d4e4-1d3d-8b76-2737c3e97c39" [ 3126.435827] nova-compute[62208]: _type = "Task" [ 3126.435827] nova-compute[62208]: } to complete. 
{{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3126.439033] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3126.439033] nova-compute[62208]: warnings.warn( [ 3126.444020] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]521c0aef-d4e4-1d3d-8b76-2737c3e97c39, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3126.940265] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3126.940265] nova-compute[62208]: warnings.warn( [ 3126.946135] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3126.946397] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Processing image 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 3126.946613] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3127.061513] nova-compute[62208]: DEBUG nova.compute.manager [req-e4b99d9f-ae31-4837-b363-56736b929064 req-db0895f1-416e-46ba-b873-133aafc9820e service nova] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Received event network-changed-755506d4-ed0f-4753-b37c-9cd11ff94386 {{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11122}} [ 3127.061723] nova-compute[62208]: DEBUG nova.compute.manager [req-e4b99d9f-ae31-4837-b363-56736b929064 req-db0895f1-416e-46ba-b873-133aafc9820e service nova] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Refreshing instance network info cache due to event network-changed-755506d4-ed0f-4753-b37c-9cd11ff94386. 
{{(pid=62208) external_instance_event /opt/stack/nova/nova/compute/manager.py:11127}} [ 3127.061925] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e4b99d9f-ae31-4837-b363-56736b929064 req-db0895f1-416e-46ba-b873-133aafc9820e service nova] Acquiring lock "refresh_cache-cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3127.062071] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e4b99d9f-ae31-4837-b363-56736b929064 req-db0895f1-416e-46ba-b873-133aafc9820e service nova] Acquired lock "refresh_cache-cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3127.062239] nova-compute[62208]: DEBUG nova.network.neutron [req-e4b99d9f-ae31-4837-b363-56736b929064 req-db0895f1-416e-46ba-b873-133aafc9820e service nova] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Refreshing network info cache for port 755506d4-ed0f-4753-b37c-9cd11ff94386 {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 3127.141125] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3127.285299] nova-compute[62208]: DEBUG nova.network.neutron [req-e4b99d9f-ae31-4837-b363-56736b929064 req-db0895f1-416e-46ba-b873-133aafc9820e service nova] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Updated VIF entry in instance network info cache for port 755506d4-ed0f-4753-b37c-9cd11ff94386. {{(pid=62208) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 3127.285660] nova-compute[62208]: DEBUG nova.network.neutron [req-e4b99d9f-ae31-4837-b363-56736b929064 req-db0895f1-416e-46ba-b873-133aafc9820e service nova] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Updating instance_info_cache with network_info: [{"id": "755506d4-ed0f-4753-b37c-9cd11ff94386", "address": "fa:16:3e:82:83:ce", "network": {"id": "88eb8543-19a3-430b-8862-5d8f5e5d3f26", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2001032116-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}}], "meta": {"injected": false, "tenant_id": "a1485ca2a9104710b083712d1c2db0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39cd75b0-9ec7-48ed-b57f-34da0c573a60", "external-id": "nsx-vlan-transportzone-751", "segmentation_id": 751, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap755506d4-ed", "ovs_interfaceid": "755506d4-ed0f-4753-b37c-9cd11ff94386", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3127.295055] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [req-e4b99d9f-ae31-4837-b363-56736b929064 req-db0895f1-416e-46ba-b873-133aafc9820e service nova] Releasing lock "refresh_cache-cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" 
{{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3129.138236] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3131.141175] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3134.142239] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3134.142537] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3134.142642] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 3135.141607] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3138.143387] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3138.153780] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3138.154012] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3138.154236] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3138.154409] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 3138.155971] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2192c67b-462d-4e5a-9323-7af27e0f6856 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3138.159472] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3138.159472] nova-compute[62208]: warnings.warn( [ 3138.165439] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16a11b11-5b2e-424c-a82c-7e3d7a41b949 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3138.168950] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3138.168950] nova-compute[62208]: warnings.warn( [ 3138.180584] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef495110-17e2-438c-834e-1fc234ebac35 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3138.182721] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3138.182721] nova-compute[62208]: warnings.warn( [ 3138.186818] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-254c60e6-c7f0-4e89-b051-a60b8bef8fe6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3138.189526] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3138.189526] nova-compute[62208]: warnings.warn( [ 3138.216278] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181967MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 3138.216555] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3138.216767] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3138.262844] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance 313c553e-5e99-47b7-9ab3-7517df11aa15 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 3138.263047] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 3138.263227] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 3138.263368] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 3138.299700] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11c122f3-bf86-4cb8-b3ef-c04abdf93f19 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3138.302208] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3138.302208] nova-compute[62208]: warnings.warn( [ 3138.307430] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-531f42bb-6171-4ab2-9205-6d519f10b64c {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3138.310637] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3138.310637] nova-compute[62208]: warnings.warn( [ 3138.338951] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77c30b9b-5783-4136-8874-5593a77ec415 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3138.341390] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3138.341390] nova-compute[62208]: warnings.warn( [ 3138.346780] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60fa6d89-8f13-4844-bbf8-9da5f9ee2d0d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3138.350445] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3138.350445] nova-compute[62208]: warnings.warn( [ 3138.359931] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3138.368741] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3138.385495] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 3138.385697] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3170.081481] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 3170.081481] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 3170.082412] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 
tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 3170.083641] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 3170.083883] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Copying Virtual Disk [datastore2] vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/c086e77a-4fb7-419e-a9bf-b4c22f4073a4/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 3170.084248] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6c487f72-ddc1-4b66-b55f-254facfd90d3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3170.086389] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.086389] nova-compute[62208]: warnings.warn( [ 3170.094215] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Waiting for the task: (returnval){ [ 3170.094215] nova-compute[62208]: value = "task-38773" [ 3170.094215] nova-compute[62208]: _type = "Task" [ 3170.094215] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3170.097618] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.097618] nova-compute[62208]: warnings.warn( [ 3170.102736] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Task: {'id': task-38773, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3170.598427] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[ 3170.598427] nova-compute[62208]: warnings.warn(
[ 3170.604511] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 3170.604818] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 3170.605382] nova-compute[62208]: Faults: ['InvalidArgument']
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Traceback (most recent call last):
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] yield resources
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] self.driver.spawn(context, instance, image_meta,
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] self._fetch_image_if_missing(context, vi)
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] image_cache(vi, tmp_image_ds_loc)
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] vm_util.copy_virtual_disk(
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] session._wait_for_task(vmdk_copy_task)
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] return self.wait_for_task(task_ref)
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] return evt.wait()
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] result = hub.switch()
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] return self.greenlet.switch()
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] self.f(*self.args, **self.kw)
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] raise exceptions.translate_fault(task_info.error)
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Faults: ['InvalidArgument']
[ 3170.605382] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15]
[ 3170.607024] nova-compute[62208]: INFO nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Terminating instance
[ 3170.607292] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquired lock "[datastore2]
devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3170.607538] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 3170.607792] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f53e775b-529b-4713-a608-83090360c239 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3170.609991] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquiring lock "refresh_cache-313c553e-5e99-47b7-9ab3-7517df11aa15" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3170.610147] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquired lock "refresh_cache-313c553e-5e99-47b7-9ab3-7517df11aa15" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3170.610313] nova-compute[62208]: DEBUG nova.network.neutron [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 3170.611898] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.611898] nova-compute[62208]: warnings.warn( [ 3170.618377] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 3170.618564] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62208) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 3170.619860] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-098110a4-1e79-407a-8012-9545a1a4d412 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3170.624226] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.624226] nova-compute[62208]: warnings.warn( [ 3170.627683] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3170.627683] nova-compute[62208]: value = "session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525e5017-aadc-a3a5-333a-3cfc629ac4ab" [ 3170.627683] nova-compute[62208]: _type = "Task" [ 3170.627683] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3170.630551] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.630551] nova-compute[62208]: warnings.warn( [ 3170.635703] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': session[52436ce8-c4cf-5b15-aa20-2021101f70ad]525e5017-aadc-a3a5-333a-3cfc629ac4ab, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3170.642235] nova-compute[62208]: DEBUG nova.network.neutron [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 3170.665273] nova-compute[62208]: DEBUG nova.network.neutron [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3170.674203] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Releasing lock "refresh_cache-313c553e-5e99-47b7-9ab3-7517df11aa15" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3170.674700] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 3170.674906] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 3170.675967] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be31b5b2-01ca-48ed-95a8-e8c4bfa6cda4 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3170.678829] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.678829] nova-compute[62208]: warnings.warn( [ 3170.683968] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 3170.684257] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9d4a2ecd-2f7a-4b83-a6a7-63d7fe9dd2d6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3170.685635] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.685635] nova-compute[62208]: warnings.warn( [ 3170.711966] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 3170.712207] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 3170.712389] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Deleting the datastore file [datastore2] 313c553e-5e99-47b7-9ab3-7517df11aa15 {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 3170.712643] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3c571ee9-2d20-4125-ad3e-89418cab106f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3170.714418] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.714418] nova-compute[62208]: warnings.warn( [ 3170.718915] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Waiting for the task: (returnval){ [ 3170.718915] nova-compute[62208]: value = "task-38775" [ 3170.718915] nova-compute[62208]: _type = "Task" [ 3170.718915] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3170.723485] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3170.723485] nova-compute[62208]: warnings.warn( [ 3170.728169] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Task: {'id': task-38775, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3171.132294] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.132294] nova-compute[62208]: warnings.warn( [ 3171.138620] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Preparing fetch location {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 3171.138897] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Creating directory with path [datastore2] vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 3171.139174] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2f59ac82-34ac-4f7f-b26d-21aa2419753b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.141312] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.141312] nova-compute[62208]: warnings.warn( [ 3171.151185] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Created directory with path [datastore2] vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7 {{(pid=62208) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 3171.151380] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Fetch image to [datastore2] vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 3171.151927] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to [datastore2] vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 3171.152694] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1282159f-ec2f-4de7-97a8-a5e015f222d7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.155044] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. 
Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.155044] nova-compute[62208]: warnings.warn( [ 3171.159726] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69dd6d6f-b187-4056-9ba0-af435be1a8b3 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.161926] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.161926] nova-compute[62208]: warnings.warn( [ 3171.168857] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0ad460f-6b6e-4a8f-aa05-d8e41561bee0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.172405] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.172405] nova-compute[62208]: warnings.warn( [ 3171.200484] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dba56037-c440-47e5-a63e-7f5170964176 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.203009] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.203009] nova-compute[62208]: warnings.warn( [ 3171.207111] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fd70065d-fcce-46b0-9612-75579fe905cf {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.208836] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.208836] nova-compute[62208]: warnings.warn( [ 3171.222987] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.222987] nova-compute[62208]: warnings.warn( [ 3171.229223] nova-compute[62208]: DEBUG oslo_vmware.api [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Task: {'id': task-38775, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.040875} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3171.230800] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 3171.231053] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 3171.231292] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 3171.231513] nova-compute[62208]: INFO nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Took 0.56 seconds to destroy the instance on the hypervisor. [ 3171.231802] nova-compute[62208]: DEBUG oslo.service.loopingcall [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62208) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 3171.232263] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Downloading image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 3171.234317] nova-compute[62208]: DEBUG nova.compute.manager [-] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 3171.236569] nova-compute[62208]: DEBUG nova.compute.claims [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935eb50f0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 3171.236772] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3171.237028] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3171.284251] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Creating HTTP connection to write to file with size = 50659328 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 3171.341020] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Completed reading data from the image iterator. {{(pid=62208) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 3171.341231] nova-compute[62208]: DEBUG oslo_vmware.rw_handles [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62208) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 3171.362735] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85f24c00-6ff8-48b9-9bc4-074ed420bb2d {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.365228] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.365228] nova-compute[62208]: warnings.warn( [ 3171.370279] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4112206-5a6f-46b3-b62a-54fdbcc077a7 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.373391] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.373391] nova-compute[62208]: warnings.warn( [ 3171.401302] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab5e9e04-6bd3-44ac-a043-a3d45b044170 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.403686] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.403686] nova-compute[62208]: warnings.warn( [ 3171.408673] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcafd3f3-3ef9-4708-a45c-d055e26fcd62 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3171.412371] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3171.412371] nova-compute[62208]: warnings.warn( [ 3171.421680] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3171.430469] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3171.445782] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.209s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3171.446335] nova-compute[62208]: Faults: ['InvalidArgument'] [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Traceback (most recent call last): [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] self.driver.spawn(context, instance, image_meta, [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] self._vmops.spawn(context, instance, image_meta, injected_files, [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] self._fetch_image_if_missing(context, vi) [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] image_cache(vi, tmp_image_ds_loc) [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] vm_util.copy_virtual_disk( [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] session._wait_for_task(vmdk_copy_task) [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] return self.wait_for_task(task_ref) [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] return evt.wait() [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] result = hub.switch() [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] return self.greenlet.switch() [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] self.f(*self.args, **self.kw) [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] raise exceptions.translate_fault(task_info.error) [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Faults: ['InvalidArgument'] [ 3171.446335] nova-compute[62208]: ERROR nova.compute.manager [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] [ 3171.447092] nova-compute[62208]: DEBUG nova.compute.utils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 
tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 3171.448548] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Build of instance 313c553e-5e99-47b7-9ab3-7517df11aa15 was re-scheduled: A specified parameter was not correct: fileType [ 3171.448548] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 3171.448995] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 3171.449221] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquiring lock "refresh_cache-313c553e-5e99-47b7-9ab3-7517df11aa15" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 3171.449370] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Acquired lock "refresh_cache-313c553e-5e99-47b7-9ab3-7517df11aa15" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 3171.449530] nova-compute[62208]: DEBUG nova.network.neutron [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Building network info cache for instance {{(pid=62208) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 3171.476579] nova-compute[62208]: DEBUG nova.network.neutron [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Instance cache missing network info. 
{{(pid=62208) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 3171.500546] nova-compute[62208]: DEBUG nova.network.neutron [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3171.509731] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Releasing lock "refresh_cache-313c553e-5e99-47b7-9ab3-7517df11aa15" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3171.509957] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 3171.510139] nova-compute[62208]: DEBUG nova.compute.manager [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] [instance: 313c553e-5e99-47b7-9ab3-7517df11aa15] Skipping network deallocation for instance since networking was not requested. {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2276}} [ 3171.600299] nova-compute[62208]: INFO nova.scheduler.client.report [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Deleted allocations for instance 313c553e-5e99-47b7-9ab3-7517df11aa15 [ 3171.619184] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-6d269eb3-7040-4369-a3f7-64051be512f5 tempest-ServerShowV254Test-919470247 tempest-ServerShowV254Test-919470247-project-member] Lock "313c553e-5e99-47b7-9ab3-7517df11aa15" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 95.769s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3186.385070] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3186.385423] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 3186.385423] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 3186.396131] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Skipping network cache update for instance because it is Building. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9945}} [ 3186.396314] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 3186.397149] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3189.141990] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3190.135868] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3193.140726] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3194.141853] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3196.141484] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3196.141765] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3196.141925] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62208) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10551}} [ 3199.140704] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager.update_available_resource {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3199.151357] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3199.151581] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3199.151752] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3199.151914] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62208) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 3199.153047] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3da74b5d-0093-4ec8-9447-e1903805c8d6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3199.157047] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3199.157047] nova-compute[62208]: warnings.warn( [ 3199.163064] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c6a44f-753a-4e9f-825b-7d430e2c563f {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3199.166722] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3199.166722] nova-compute[62208]: warnings.warn( [ 3199.176567] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b8e0311-753e-46af-8ced-25c8399cad17 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3199.178931] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3199.178931] nova-compute[62208]: warnings.warn( [ 3199.183003] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95a811d-d822-4f79-a210-df14c3c62dc5 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3199.185638] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3199.185638] nova-compute[62208]: warnings.warn( [ 3199.210905] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181969MB free_disk=197GB free_vcpus=48 pci_devices=None {{(pid=62208) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 3199.211056] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3199.211238] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3199.248135] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Instance cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62208) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 3199.248341] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 3199.248486] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=62208) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 3199.274802] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc3845f9-edd8-47a0-9ed7-33d2525950c0 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3199.277229] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3199.277229] nova-compute[62208]: warnings.warn( [ 3199.282474] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0263820d-67cd-4035-84f2-4fea0ae4b834 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3199.285444] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3199.285444] nova-compute[62208]: warnings.warn( [ 3199.312816] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c76f1f9-8946-44fe-af4c-7742a840ebf6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3199.315023] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3199.315023] nova-compute[62208]: warnings.warn( [ 3199.320121] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61c59b58-e37a-412f-8989-9d4881ff83ff {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3199.323583] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3199.323583] nova-compute[62208]: warnings.warn( [ 3199.333824] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3199.341852] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3199.357609] nova-compute[62208]: DEBUG nova.compute.resource_tracker [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62208) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 3199.357792] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3207.355141] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3220.380426] nova-compute[62208]: WARNING oslo_vmware.rw_handles [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles response.begin() [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 3220.380426] nova-compute[62208]: ERROR 
oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 3220.380426] nova-compute[62208]: ERROR oslo_vmware.rw_handles [ 3220.381030] nova-compute[62208]: DEBUG nova.virt.vmwareapi.images [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Downloaded image file data 77df2b34-a7d7-43a1-a59a-01f7474c0cf7 to vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk on the data store datastore2 {{(pid=62208) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 3220.382775] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Caching image {{(pid=62208) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 3220.383025] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vm_util [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Copying Virtual Disk [datastore2] vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/tmp-sparse.vmdk to [datastore2] vmware_temp/0739bdfa-7af0-48f1-a929-6da64beb0946/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk {{(pid=62208) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 3220.383327] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a0466b0b-cef6-4dba-ab76-364b735eb840 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3220.385510] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3220.385510] nova-compute[62208]: warnings.warn( [ 3220.392954] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3220.392954] nova-compute[62208]: value = "task-38776" [ 3220.392954] nova-compute[62208]: _type = "Task" [ 3220.392954] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3220.396456] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3220.396456] nova-compute[62208]: warnings.warn( [ 3220.401371] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38776, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3220.897798] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3220.897798] nova-compute[62208]: warnings.warn( [ 3220.903755] nova-compute[62208]: DEBUG oslo_vmware.exceptions [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Fault InvalidArgument not matched. {{(pid=62208) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 3220.904046] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Releasing lock "[datastore2] devstack-image-cache_base/77df2b34-a7d7-43a1-a59a-01f7474c0cf7/77df2b34-a7d7-43a1-a59a-01f7474c0cf7.vmdk" {{(pid=62208) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3220.904579] nova-compute[62208]: Faults: ['InvalidArgument'] [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Traceback (most recent call last): [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/compute/manager.py", line 2885, in _build_resources [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] yield resources [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] self.driver.spawn(context, instance, image_meta, [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] self._fetch_image_if_missing(context, vi) [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: 
cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] image_cache(vi, tmp_image_ds_loc) [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] vm_util.copy_virtual_disk( [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] session._wait_for_task(vmdk_copy_task) [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] return self.wait_for_task(task_ref) [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] return evt.wait() [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] result = hub.switch() [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] return self.greenlet.switch() [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] self.f(*self.args, **self.kw) [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] raise exceptions.translate_fault(task_info.error) [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Faults: ['InvalidArgument'] [ 3220.904579] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] [ 3220.905425] nova-compute[62208]: INFO nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: 
cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Terminating instance [ 3220.907885] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Start destroying the instance on the hypervisor. {{(pid=62208) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3141}} [ 3220.908090] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Destroying instance {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 3220.908861] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d64d9569-7db6-48b9-a52a-14b2121981a6 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3220.911296] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3220.911296] nova-compute[62208]: warnings.warn( [ 3220.915377] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Unregistering the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 3220.915599] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89f44595-1e71-45ec-b2f4-3eab27e0787a {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3220.917060] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3220.917060] nova-compute[62208]: warnings.warn( [ 3220.980829] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Unregistered the VM {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 3220.981157] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Deleting contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 3220.981412] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Deleting the datastore file [datastore2] cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 3220.981838] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-14a934eb-346c-4310-abe3-6cb050a94304 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3220.984816] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3220.984816] nova-compute[62208]: warnings.warn( [ 3220.990117] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Waiting for the task: (returnval){ [ 3220.990117] nova-compute[62208]: value = "task-38778" [ 3220.990117] nova-compute[62208]: _type = "Task" [ 3220.990117] nova-compute[62208]: } to complete. {{(pid=62208) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 3220.994733] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3220.994733] nova-compute[62208]: warnings.warn( [ 3221.001968] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38778, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 3221.494859] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3221.494859] nova-compute[62208]: warnings.warn( [ 3221.500611] nova-compute[62208]: DEBUG oslo_vmware.api [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Task: {'id': task-38778, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070277} completed successfully. {{(pid=62208) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 3221.500856] nova-compute[62208]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Deleted the datastore file {{(pid=62208) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 3221.501038] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Deleted contents of the VM from datastore datastore2 {{(pid=62208) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 3221.501211] nova-compute[62208]: DEBUG nova.virt.vmwareapi.vmops [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Instance destroyed {{(pid=62208) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 3221.501387] nova-compute[62208]: INFO nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 3221.503667] nova-compute[62208]: DEBUG nova.compute.claims [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Aborting claim: <nova.compute.claims.Claim object at 0x7fb935eb59f0> {{(pid=62208) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 3221.503847] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 3221.504127] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 3221.570412] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7496aedf-d085-4993-a119-ce049c1705cc {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3221.573264] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3221.573264] nova-compute[62208]: warnings.warn( [ 3221.578295] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b51fa883-cfb7-4f8e-8e92-d8f02bcaf31b {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3221.581346] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3221.581346] nova-compute[62208]: warnings.warn( [ 3221.610443] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f8b478b-f765-4f28-8b57-a16f3db0df37 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3221.612895] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3221.612895] nova-compute[62208]: warnings.warn( [ 3221.618330] nova-compute[62208]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf643036-f081-4ca4-908e-939195ec8272 {{(pid=62208) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 3221.622077] nova-compute[62208]: /opt/stack/data/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vc1.osci.c.eu-de-1.cloud.sap'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings [ 3221.622077] nova-compute[62208]: warnings.warn( [ 3221.631824] nova-compute[62208]: DEBUG nova.compute.provider_tree [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Inventory has not changed in ProviderTree for provider: 8d308854-9c5b-48ef-bafe-5c6c728e46d8 {{(pid=62208) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 3221.640457] nova-compute[62208]: DEBUG nova.scheduler.client.report [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Inventory has not changed for provider 8d308854-9c5b-48ef-bafe-5c6c728e46d8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 197, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62208) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 3221.656323] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.152s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3221.656891] nova-compute[62208]: Faults: ['InvalidArgument'] [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Traceback (most recent call last): [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/compute/manager.py", line 2632, in _build_and_run_instance [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] self.driver.spawn(context, instance, image_meta, [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 3221.656891] 
nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] self._fetch_image_if_missing(context, vi) [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] image_cache(vi, tmp_image_ds_loc) [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] vm_util.copy_virtual_disk( [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] session._wait_for_task(vmdk_copy_task) [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] return self.wait_for_task(task_ref) [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] return evt.wait() [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 124, in wait [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] result = hub.switch() [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 310, in switch [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] return self.greenlet.switch() [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] self.f(*self.args, **self.kw) [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 3221.656891] nova-compute[62208]: ERROR 
nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] raise exceptions.translate_fault(task_info.error) [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Faults: ['InvalidArgument'] [ 3221.656891] nova-compute[62208]: ERROR nova.compute.manager [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] [ 3221.657692] nova-compute[62208]: DEBUG nova.compute.utils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] VimFaultException {{(pid=62208) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 3221.659100] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Build of instance cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e was re-scheduled: A specified parameter was not correct: fileType [ 3221.659100] nova-compute[62208]: Faults: ['InvalidArgument'] {{(pid=62208) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2471}} [ 3221.659474] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Unplugging VIFs for instance {{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} [ 3221.659687] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62208) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3020}} [ 3221.659883] nova-compute[62208]: DEBUG nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Deallocating network for instance {{(pid=62208) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2280}} [ 3221.660066] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] deallocate_for_instance() {{(pid=62208) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 3221.894392] nova-compute[62208]: DEBUG nova.network.neutron [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Updating instance_info_cache with network_info: [] {{(pid=62208) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 3221.909178] nova-compute[62208]: INFO nova.compute.manager [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] [instance: cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e] Took 0.25 seconds to deallocate network for instance. [ 3222.013161] nova-compute[62208]: INFO nova.scheduler.client.report [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Deleted allocations for instance cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e [ 3222.034433] nova-compute[62208]: DEBUG oslo_concurrency.lockutils [None req-cc68ae49-408f-4a4e-a6f1-9c1bfb7ebbee tempest-AttachVolumeTestJSON-1594482043 tempest-AttachVolumeTestJSON-1594482043-project-member] Lock "cd8f8d8c-b48f-4fe9-999a-2cd1f60aec3e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 98.084s {{(pid=62208) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 3246.143273] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3246.143273] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Starting heal instance info cache {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9932}} [ 3246.143273] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Rebuilding the list of instances to heal {{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9936}} [ 3246.152122] nova-compute[62208]: DEBUG nova.compute.manager [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Didn't find any instances for network info cache update. 
{{(pid=62208) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10018}} [ 3248.142060] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3250.140694] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3251.136128] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 3253.140662] nova-compute[62208]: DEBUG oslo_service.periodic_task [None req-d1ff576b-bd20-4449-90c6-e59b641de6ec None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62208) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}